Key takeaways
- Enterprise AI and BI spending is at an all-time high, yet only 8% of employees use advanced analytics regularly
- The biggest blocker to data and AI value is not technology but organisational readiness, skills, and culture
- Legacy systems and poor data quality remain the most cited blockers, yet organisations continue to prioritise new technology spend over fixing the fundamentals
- ROI from data and AI is rarely measured clearly upfront, and ownership of outcomes is poorly defined across the business
- A 'start small, prove value, scale' approach consistently outperforms large-scale technology-first investment
- Data literacy and culture programmes are under-invested, yet they are the key to unlocking the value already spent on platforms and tooling
- Framing data initiatives through business outcomes instead of outputs is the single most powerful shift data leaders can make
The billion-pound question
Every year, enterprise organisations pour ever-larger sums into data and AI. Global AI spend is projected by Gartner to reach $1.5 trillion in 2025 alone. Business intelligence investment is growing at over 10% year on year and shows no sign of slowing. The message from the boardroom is clear: this is the future, and we need to be in it.
But for all that investment, one stubborn gap remains: most organisations are not seeing the return they were promised.
Only 8% of employees use advanced analytics on a regular basis. A staggering 68% of employees cite a lack of AI skills. Yet 24% of companies are planning to triple their advanced analytics spend in the next twelve months, and 89% of organisations are expected to adopt generative AI by 2027.
The maths simply does not add up, and it is time to talk about why.
The current moment in data and AI mirrors what happened with big data a decade ago. The hype and investments are real, but for many organisations, the readiness is not where it needs to be to deliver.
Plenty of businesses are still operating with the same organisational data maturity they had in 2019, or earlier. Decisions are still being made on gut feel and data teams are still isolated from commercial strategy. The culture of data-informed decision-making as the standard is still often lagging. And yet, the proposal is to build sophisticated AI products on top of this shaky foundation.
"You're driving the train while laying the track. I don't think I've seen an organisation be able to pause the train, lay the track perfectly, and then drive merrily to the next stop."
— Jason Foster, Founder & CEO, Cynozure
Hype is not inherently a bad thing. Astute data leaders have always used it to unlock budget, drive interest, and accelerate change. The risk comes when expectations are set so high that no delivery could ever live up to them, leaving organisations disillusioned, and data teams on the defensive.
We are now at that inflection point. Eight out of ten client engagements we see right now share the same problem statement: significant investment has been made, but the value has not materialised. Now what?
Technology is not the blocker you think it is
Ask enterprise organisations what is holding them back from moving faster with data and AI, and the answers are consistent: legacy systems and outdated infrastructure, followed by data quality and governance challenges.
On the surface, this might look like a technology problem. But look more carefully, and a different picture emerges.
Legacy technology persists not because organisations cannot afford to replace it, but because the act of replacing it is far more complex than anticipated. Systems that were supposed to be retired turn out to be deeply embedded in operational processes: financial calculations, supply chain integrations, line-of-business applications. You start pulling the thread and find you cannot unpick the spaghetti. So you build on top. Legacy upon legacy.
"The people who knew the legacy system have gone, and no one else dares touch it. And then there's just a mess of data and quality inside those systems."
— Jason Foster
Data quality and governance present an equally frustrating challenge but again, not primarily for technological reasons. The reality is that the vast majority of data in most organisations is created by human beings, which makes it a human problem. Technology can support better governance, but it cannot manufacture the culture, accountability, and understanding that make data trustworthy. The more literate a business becomes, the more its data quality improves, because people understand why it matters.
The ROI blind spot
Perhaps the most striking finding from industry research is this: fewer than 6% of large organisations cite an unclear ROI or business case as their primary blocker to data and AI progress.
That is not because ROI is easy to demonstrate. It is because most organisations are not treating it as their responsibility to demonstrate it at all.
Data teams are typically held accountable for building capabilities, from deploying technology to cleaning data and training teams. They are developing the muscles. But someone else is responsible for the commercial outcomes that those muscles are supposed to deliver. This disconnect in ownership is at the heart of why so much investment fails to translate into value.
"Data functions aren't particularly held to account upfront for the impact the investor is looking to achieve. There's been a leap of faith that if we invest, good things will surely result."
— Jason Foster
A more effective approach is what we call 'return on data investment' or RODI. Rather than investing in broad capability-building and hoping value will follow, organisations identify a specific business problem, define what success looks like in commercial terms, and then measure against it. That line of sight is what earns the right to further investment and builds trust with the leadership team.
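As a purely illustrative sketch, the RODI discipline reduces to simple arithmetic once success has been quantified in commercial terms before the spend. The function name and figures below are hypothetical, invented for illustration rather than taken from any client engagement:

```python
# Hypothetical illustration of RODI (return on data investment):
# the value of success is quantified in commercial terms BEFORE
# the investment, then measured against. All figures are invented.

def rodi(value_delivered: float, investment: float) -> float:
    """Net return per pound invested: (value - cost) / cost."""
    if investment <= 0:
        raise ValueError("investment must be positive")
    return (value_delivered - investment) / investment

# Agreed upfront: success means closing a margin gap quantified
# at £1.2m, for a project spend of £400k.
print(f"RODI: {rodi(1_200_000, 400_000):.0%}")  # prints "RODI: 200%"
```

The point is not the formula itself but the sequencing: the £1.2m figure exists, and is signed off by commercial leadership, before a pound is spent.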
Think of it like building a bridge. You know why the bridge needs to exist. You know who will use it and what it will enable. You can quantify the value before a single stone is laid. Data and AI projects should work the same way but all too often, they do not.
The Elizabeth line principle: know your value before you start digging
A powerful analogy here is the Elizabeth line in London. Before construction began, economists and planners had calculated the anticipated value: faster journeys, increased footfall, higher commercial activity around new stations, better access to underserved communities. There was a business case, not just an engineering spec.
And when the line opened, it delivered benefits that had not even been anticipated: new retail, housing development, economic regeneration. Those halo benefits were a bonus rather than a rescue, because the core value was understood and committed to from the outset.
Contrast this with HS2, where the goalposts shifted, the scope expanded, and the value case became increasingly difficult to defend. Many data and AI programmes look more like HS2 than the Elizabeth line.
The lesson is not just for data teams. Commercial leaders (CFOs, CEOs, sales directors) need to be co-authors of the value case, not passive signatories. When a data project delivers a 7.9% uplift in revenue, as one of our clients achieved after building a personalised pricing model for contract renewals, that story should be owned by the whole commercial leadership team, not just the data office.
That client had a specific problem: margins being squeezed by energy costs, with blanket renewal pricing that was no longer sustainable. The data team did not build a platform for its own sake. They built a pricing model tied directly to a business problem. The story they told was not about dashboards and models — it was about closing a margin gap and improving revenue.
That is the standard every data team should be held to.
The supply and demand imbalance nobody talks about
There is a concept that rarely surfaces in budget conversations but explains a great deal about why data investment underperforms: the gap between capability supply and capability demand.
Most organisations are investing heavily in supply, from platforms, tools and models to dashboards. But they are investing almost nothing in demand: the organisational appetite, understanding, and skills needed to actually use those capabilities to solve real business problems.
The result? Data products are pushed out to a business that has not asked for them. There is not enough pull from commercial teams, not enough curiosity from senior leaders, not enough understanding of what is possible. The data team is essentially broadcasting to an audience that has not tuned in.
A telling example: a CFO and COO were in the process of making a significant strategic shift in their market position. When asked at what point they had engaged the chief data officer, the CFO's response was: 'That's a business problem, why would I have spoken to the CDO?'
That response is not unusual. It is the norm. And it will remain the norm until organisations invest as seriously in building data literacy across the business as they do in building data capability within the data team.
The people problem: why culture cannot be an afterthought
Gartner projects that by 2027, more than half of CDOs will secure dedicated funding for data and AI literacy programmes, driven by enterprise failure to realise generative AI value. That figure tells you two things: it is coming, and it has not happened yet.
Currently, only 16% of organisations are prioritising data literacy spend, despite 66% of CEOs recognising culture as a significant challenge. The gap between recognition and action is vast.
Why? Partly because literacy and culture feel softer and harder to justify than technology. Partly because organisations worry about training people before the infrastructure is ready for them to use. And partly because the budget has already been spent: £75 million on a technology platform leaves precious little for the human side of transformation.
But the organisations that get this right are the ones that plan the people piece in parallel with the technology piece, not as a follow-up fix, but as an integrated part of the programme from day one. A warm, ready audience when a new platform launches is not a nice-to-have. It is the difference between adoption and shelfware.
"The best programmes we've run are the ones where the people, skills, and culture piece has been considered in direct alignment with the deployment of the data platform — not as an afterthought."
— Greg Freeman, CEO & Founder, Data Literacy Academy
One large retailer demonstrated the power of radical commitment: they switched off their old BI tool on a Friday and switched on the new one on Monday. No parallel running, no extended transition. It was brave, even brazen. But it drove adoption because it left people no other choice. Cultural bravery of that kind is rarer than it should be.
How to build a balanced data and AI budget
If your current budget allocation looks something like 80% technology, 10% people and skills, and 10% change management, you are unlikely to achieve the outcomes you are hoping for. Here is a more balanced framework to consider.
Before you allocate a single pound, ask:
- What specific business outcomes are we trying to achieve, and who in the commercial leadership team co-owns those outcomes with us?
- Have we agreed upfront with our executive peers what success looks like and what it would mean to say the investment has paid for itself?
- Have we built in a skills and culture uplift plan that runs in parallel with technology deployment, not after it?
- Do we have a clear way to measure business outcomes (not just data outputs) downstream?
- Have we considered appointing a value realisation professional, someone whose explicit job is to connect data investment to commercial results?
- Are we investing only in initiatives where we know what value we are trying to achieve, or are we building capability in the hope that value will follow?
- Is our budget driven by the data office, or is literacy and culture spend being left entirely to L&D with no skin in the game from the CDO?
Start small. Prove value. Then scale.
The organisations that generate the most sustainable return from data and AI are rarely the ones with the biggest budgets. They are the ones that pick a specific, commercially meaningful problem, solve it with data, demonstrate the return, and use that success to earn the right to do more.
This approach requires a shift in mindset, from 'we need to build the infrastructure before we can show value' to 'we need to show value to justify the infrastructure'. It requires data leaders who understand how a P&L works. It requires commercial leaders who understand what data and AI can actually do. And it requires both sides to co-own the outcomes.
The satisfaction gap, the distance between expectations and reality, is where most data programmes go wrong. Set modest, credible expectations. Deliver against them. And then let the results speak for themselves.
That is how you build trust, earn budget, and ultimately transform an organisation's relationship with data.
About the speakers
Greg Freeman is the CEO and Founder of Data Literacy Academy, which partners with enterprise organisations to roll out data and AI literacy and culture programmes at scale. Jason Foster is the Founder and CEO of Cynozure, a consultancy specialising in translating data and AI investment into clear business impact. Jason is also the author of Data Means Business.
WEBINAR TRANSCRIPT: Why Your 2026 Data Budget Is Setting You Up to Fail
Speakers: Greg Freeman (CEO & Founder, Data Literacy Academy) and Jason Foster (Founder & CEO, Cynozure)
---
GREG: Okay, so we will get into it. Why your 2026 data budget is setting you up to fail. A little bit provocative, but certainly some food for thought, and we'll try and cover as much of that as we go through the session.
To start with, my name is Greg Freeman. I'm the CEO and founder of Data Literacy Academy. We are an organisation that partners with enterprise organisations to help them roll out data and AI literacy and culture programmes at scale. Our specialist area is on the business side of data. We don't do much work with data professionals. For us, it's about how do we bring the wider organisation — the 97, 98% of people who are not already on the journey and are not already experts — on their own data journey, and bring them up to speed from a mindset, behaviours, and skills perspective.
I'll hand over to Jason to introduce himself.
JASON: Hi, thanks Greg, and hi everyone. Lovely to have you here. Looking forward to the session. I'm Jason. I'm the founder and chief executive at Cynozure. I'm the author of Data Means Business. At Cynozure, we're a consultancy that helps leadership and their teams ensure that data and AI properly translate into clear — hopefully P&L impact, but impact in some way depending on what industry you're in. In our experience, companies can spend a lot of money on platforms and teams and models and experiments and all sorts of things, and then struggle to turn that into something they would call ROI, or faster decisions, or better business outcomes. We're normally brought in to help solve that. Looking forward to the discussion with Greg.
GREG: Jason's so much better at introductions. He remembers that he's an author.
JASON: You're so new to it. I've had the book for a while now.
GREG: Yes, promotion of Data Means Business is a very good lens to take.
In today's session, we'll work through some slides, but the thing I want to get across most is mine and Jason's experience, opinions, and thoughts on the topics — not just presenting slides. I'll talk through what's on a slide and then ask Jason for his view, and I'll feed in my own. We'll go through the current and future investment landscape, how your budget might be blocking the value realisation Jason mentioned, and ideally how to build a balanced budget model — which I'd say is something I've not seen done that well across the industry.
The problem we're here to discuss — and I think it's a key one, very relevant right now — is that every enterprise is buying straight into the hype cycle around AI. There are a few that are locking the doors, but in general the premise is: how can we deliver AI at scale faster, and how can that deliver value? It's a similar story in some ways to the data hype cycle ten years ago. As a lot of people know, that hasn't panned out as expected. Part of the reason we need to critically think about the AI hype cycle is that many organisations are still stuck in a position of organisational readiness that they were in in 2019 — if not earlier. Some organisations haven't moved that far in the last decade around how they're collecting, aggregating, and using data for decision-making. So the idea of building AI tools, AI products, AI value on top of that scenario is a really difficult one, because we all know: great data, better AI; not very good data, less good AI.
So, AI spend is booming. Any number that begins with a T is a big one. The 2025 projected total global spend from Gartner is $1.5 trillion. Similarly, BI spend is booming — a 10.74% projected spend growth rate year on year through to 2033. But when you get into the details of who's actually using this stuff, it isn't reflective of that spend. Only 8% of employees are using advanced analytics on a regular basis, yet 24% of companies are going to triple their spend on advanced analytics in the next 12 months. The maths doesn't add up in terms of adoption and value being realised. Skills are still a major inhibitor: 68% of employees cite a lack of AI skills, and 89% of organisations are expected to adopt generative AI by 2027. If 68% of employees are saying they don't get it and haven't been brought on the journey, yet 89% of businesses are throwing cash at getting this deployed across the enterprise, there's a clear disconnect in how you're going to generate value and drive adoption.
Jason, taking that problem statement — the hype cycle, business maturity and readiness, and significant investment in technology and tooling — where does that leave you, and what are you seeing?
JASON: The hype and the level of investment are connected. The hype drives the investment, and investment drives the hype. In the AI space, there's talk of a bubble bursting. What I don't think will burst is the appetite to make this work and make it successful. Hype can actually be a good thing — certainly for data teams trying to drive interest in what they're responsible for, drive some budget, and get investment done. You can use hype in a really positive way. But there's a danger that you overuse it, set expectations so high, and then it's really hard to deliver against them. It's about getting that balance right.
Back when I was in industry — when big data first came out — I used that brazenly when I was at M&S to drive interest in what we were doing, get innovation budget, do things in different ways because it had this newness about it. But then it was about getting on with it and delivering strong, impactful stuff for the organisation — not just spending the cash I'd managed to get.
The level of readiness is really interesting because you're never fully ready for what's coming. You're driving the train while laying the track — quite a hard thing to do. I don't think I've seen an organisation be able to pause the train, lay the track perfectly, and then drive merrily to the next stop.
GREG: I'll give you a chance to think while I talk. But the other thing I see a lot of is a massive appetite from data offices to spend money on technology — they don't see that as a concern. What I see less of is willingness, appetite, and bravery for radical change.
A really interesting use case with one of the large retailers: they basically picked their new technology stack, their new business intelligence tool, and just decided they were going to switch off the old one on a Friday and switch the new one on on a Monday, and see how it played out. That worked really effectively for them. It was very brave, very brazen — but it was radical change at scale, which delivered adoption because people had no other choice. I see a lot of spend on a new BI tool, a new data platform, but not the willingness to just switch off the old one and let's see what they actually need. There is a disconnect between the amount of money people are willing to spend on technology and the cultural bravery they're willing to take to actually get it adopted.
Any examples from your background?
JASON: Just to reflect on that point — budgets are about choices and priorities. Often budget goes on technology and implementing technology. If your budget reflects that choice, then you haven't left anything for how you actually make this impactful in the organisation — embedding it in the way the organisation works, in people's workflows and processes.
On organisations where they've ridden the hype and not delivered — this is the common problem statement right now, not just in the UK but across Europe and the US. People have been able to get budget on the back of high expectations without huge commitment of what it will deliver — just a leap of faith from organisations saying, well, we know there's some value here, let's invest and build the capabilities. Now we're in a phase where that investment has been made and it's showtime. People are turning to things like data culture — if we can change the culture, they'll start using the thing — and then realising it's hard to change culture without actually doing something. They're looking at different techniques to manifest value that should have been baked in from the start. Eight out of ten pieces of work we've done in the last twelve months are that problem statement: spent, invested, not got the value — now what?
GREG: One of my clear observations over the last few years is that people see the people, skills, and culture piece as either an afterthought or a fix to a problem they've created for themselves. Whereas the best programmes we've run are ones where those things are considered in direct alignment with the deployment of the data platform. Some people ask, "Is that not cart before the horse?" No — because you actually need a warm, ready audience when you're spending the kind of money on the screen. If you're looking for a twelve-month window to follow that massive spend, hope is not a strategy. A lot of people's adoption strategy is hope, or investing more money with a big consultancy. Neither works that well.
So the biggest blockers to moving faster with data and AI, according to aggregated research across the industry — for organisations over £500 million in revenue — are legacy systems and outdated technology infrastructure, followed by data quality, governance, and access challenges. Jason, why is it that in a world where businesses are spending these numbers on technology every year, the narrative from the industry is still that technology is the blocker?
JASON: What tends to happen is new technology gets deployed and old technology doesn't get retired. The plan is to replace something, but as you get into it — particularly in mature sectors like financial services and insurance — you start to find that the old thing is critical somewhere and you can't change it. You end up building on top. So you build legacy upon legacy.
A client we worked with recently: the project was in their minds to replace their old data warehouse with a new cloud-based data platform. But when we got under the covers, this data warehouse wasn't simply a reporting warehouse — it was an operational platform where financial calculations were being run. It was integrated with Salesforce, supply chain systems. You just couldn't unpick it. They could replace the bits around insight and analytics, but not the other bits. So you end up with a complicated, convoluted estate.
The other challenge is that people leave. The people who knew the legacy system are gone, and no one else dares touch it. And then on data quality and governance — it's not just the legacy tech that causes the problem, though the two often come together. There's also just a mess of data and quality inside those systems.
GREG: I find the conversation around data quality and governance one of the most frustrating — and one of the hardest challenges to solve. Governance leads and heads of data governance have got the toughest job, because it's such a horizontal problem across the whole organisation. You've got to do inspiration, effective frameworking, delivery, education — it's a multifaceted role.
The conversation I hear too often is that quality and governance can be solved through technology. But 95% or more of data in most organisations is created by human beings who interact internally or externally with them. Data quality and governance is a human problem. The more literate a business gets, the more quality improves, because people actually care about it and understand the value it's going to drive.
The other thing that blows my mind about this data: the biggest blocker is NOT listed as an unclear ROI or business case for investment — 6% of people in businesses over £500 million, and 0% in smaller organisations. Yet halfway up you've got leadership alignment, and higher up you've got budget and resource constraints. There's a massive disconnect. Jason?
JASON: I think that points to the mindset of the industry around what its responsibilities are. I don't think it says unclear ROI isn't a blocker — I just don't think people necessarily see ROI as their responsibility. There's a question about accountability and ownership for ensuring that investment produces results.
It gets looked at the wrong way round — in terms of technology, data, people, training, the capabilities needed. Data leaders and CDOs typically own the budget for building the thing: bringing in technology, engineering, cleaning data, training the organisation. They're developing the muscles. But then they're judged on the return that's delivered when someone else has got the responsibility to implement the actual change. It's a disconnect with ownership.
Also, data functions aren't particularly held to account upfront for the impact the investor is looking to achieve. There's been a leap of faith — because of the hype — that if we invest, good things will surely result, without a clear understanding of what the whole organisation would need to do to get a positive conclusion.
We use the phrase "return on data investment" — RODI. For every pound, what's the ROI we're trying to achieve? And then we measure against that. That line of sight then allows you to understand whether you've been successful, and earn the right to get more budget to continue the cycle.
GREG: Our VP of Products and Strategy often gives the analogy of a bridge. You spend money on a bridge — you know what you're trying to achieve: getting a person from island A to island B. As long as it holds up and is safe, you've achieved the expected value. The problem with data and AI is that the commercial leaders saying yes to the budget are often taking a punt. The industry and its influencers — Gartner, McKinsey, the Big Four — all say that data and AI can deliver value. But because there isn't a clear bridge taking people from point A to point B, it's blurry what the value actually looks like. The job of the data office and data leader is to build that bridge and give people something clearly understood — where they can say, "I understand how the numbers work in my business, and this data project has delivered numbers."
JASON: I'd go a step further — it's not just the responsibility of data people to build the bridge, but also to explain the value of having it, and what being able to get from A to B will get you.
The analogy I've used recently is the Elizabeth line. There was value attributed to it upfront: more people getting around London quicker, more footfall, more spending in shops. There'll be a spreadsheet somewhere that calculated the impact and the cost. But there are also all the halo benefits — new shops built around the stations, better access to areas that weren't accessible before, more communities, more housing. They knew what value they were looking to achieve before they started digging. And there will have been benefits since it was built that they didn't even think of. You've got to know why you're building the bridge and what being able to go from point A to point B will ultimately allow you to do.
I think we've got to get much better at being clear on that — and only investing in things where we know the answer.
GREG: And to your Elizabeth line analogy, it won't have been the engineer who did all that analysis. They'll have partnered with economists and taken commercial insight from outside the data office — which is something data offices can be really poor at. Meaningfully engaging executive peers and saying: do you agree with the maths on this value that will be returned? Do you sign this off? If it delivers that value, we're both celebrating. If it doesn't, that's also on both of us. A lot of data projects are more HS2 than they are the Elizabeth line, and may never actually get to value.
Rich in the Q&A has asked a great question — I think he's saying, essentially: are we getting too much budget too soon, which sets expectations too high and puts too much pressure on returns because of the investment made? It's a really good point. I'm a big advocate for starting small, scaling, and only investing in things proven to be valuable.
You can only do that if what you're investing in is the delivery of value. If you're investing in infrastructure, you're always in this cycle of: I need to finish building the infrastructure before I can show value. But if you're working on a specific business problem — say, customer retention being down in a certain market segment — you have almost no problem demonstrating ROI because you're directly working on a business problem that needs fixing.
Sometimes, frankly, people take a lot of pride in saying they own a certain amount of budget and a certain number of people. There's almost a kingdom-building mentality that then sets you up to fail. If you go in and prove that you can do a lot with not very much, you'll have the keys to the castle. People will say, "That person came in eighteen months ago, solved a real problem, it didn't cost us the earth, it's cheaper than what my peers in the Gartner network are spending on data and AI." You've set yourself up for a much better position.
JASON: There's a point here about how much we spent on data and AI versus how much return we got — it's too broad a question in some ways.
We were brought into a logistics and storage company with quite a specific problem statement. The work we did was data work — building a model, building some dashboards, showing insight. But the problem statement was that their margins were being squeezed because of energy costs to customers. Typically they'd done a blanket annual renewal across everybody, and the question was: how do we do these renewals without further eating into our margins?
The work we did was to solve that problem — not build a data platform and build dashboards for the sake of it. We worked with the commercial team as part of the team. We built a pricing model with personalised pricing based on customers and their usage — quite a complex model — and gave them scenarios they could overlay into their renewal process through sales. That year they ended up 7.9% up on revenue and closed the margin gap.
Depending on how you tell that story: either we went in and built a data platform and some models and trained their team — or we closed the margin gap and improved revenue for the year. The data industry needs to get to a place where it can articulate the problems it solves through the lens of business challenges, not data challenges.
GREG: Most people on this call who might sign off on a project — all I heard there was: it's a proper business problem, it's a business-focused solution, and a 7.9% uptick. That's what I hear. I don't care what was done underneath. What I hear is: I understand the problem, we're building a solution for it, we have a way to measure it, and that measurement has delivered value. Not enough of that goes on.
So: how are you balancing your budget for capability supply — the technology and tooling — with your budget for capability demand? A champagne problem I talk about a lot is the idea of over-demand. What if we can't meet the demand with supply? There is about one data team in the world in that position. Most people are spending all their money on the supply side and not the demand side — spraying and praying data products out to a business that hasn't asked for them. There isn't enough demand coming from the business to solve real problems.
I had an eye-opening conversation with a CFO recently. The CFO and COO were in the process of making a material shift in their market and product position. I asked at what point they'd engaged the CDO with that problem. The CFO's words were: "That's a business problem — why would I have spoken to the CDO?" How are we still at a point where, when commercial leaders are making material strategic shifts, they still don't go to the data team first? If you look at your budget and you've spent ten times more on tech than on the people piece — is it delivering the return you need?
Jason, at this time of year you're presumably having conversations with CDOs about what they're going to spend money on next year. With these stats on screen (Gartner predicts that by 2027 more than half of CDOs will secure funding for data and AI literacy programmes, fuelled by enterprise failure to realise value from generative AI, yet only 16% currently prioritise data literacy), are you seeing a changing of the guard, or is it still new strategy, new platform, new AI technology budget?
JASON: It's a real mix. And that disconnect — lots of people recognising it's a challenge, but not prioritising spend on it — we see that in our data leadership survey results, and early results from this year look the same.
The reason people want to recognise it as a challenge is that money has been spent and the benefit hasn't been realised. So: let's educate the people who can now take what we've built and do something with it. That sounds sensible. Honestly, I don't fully understand why so few people are prioritising the spend. Education is actually quite an easy place to show progress: there's huge demand from employees to be trained, particularly with AI wrapped around it, and it's a good recruitment tool. So it makes a lot of sense.
What's probably pushing it down the list is the perceived need to prioritise investment in foundations — legacy tech, data governance. Those things get prioritised because people are worried about training people up before they're able to do anything with it.
There's also a matter of pace — the speed at which you educate people versus the need for them to be able to use that learning in real time. Otherwise it was a nice course and not much came from it.
One other thing: there's a difference between data literacy and data culture. Education and skills are one element of creating the culture you need. Culture change is broad and requires both top-down executive sponsorship and bottom-up change — people on the front line using data and decision products to transform something day-to-day. You don't get culture change without actually seeing the change happen.
GREG: I hear these conversations in real time every day: "We've spent £75 million on the technology platform and we've got no budget left for the people." And I think the problem started at that budget conversation eighteen months ago. I think I could have forecast those challenges without any technology or AI.
One of my observations is that people see the people-and-skills piece as a fix to a problem they've already created, whereas the best programmes we've run are those where it's considered from the start, in direct alignment with the deployment.
I probably don't believe the Gartner 2027 vision at the moment. I think it's more likely that people will try and fix this problem the same way they always have — different consultancy, different technology platform. But I live in hope.
JASON: I do think there's quite a difference between AI literacy and data literacy. Data is more about understanding how to make a decision based on data sets, what's happened historically, using that to derive predictions. AI — particularly generative AI — is quite different. Both are required, but you probably need something slightly different in each case.
GREG: I think if the industry's dependency becomes generative AI, it's going to solve some real operational pains people face every day, make people more efficient, earn some time back. But fundamentally, it's still going to be insights-led work from data and more sophisticated AI that solves the big problems businesses face. I hope we're not going to get so obsessed with generative AI, because it has that "I can feel the impact tomorrow" quality, that we forget the real big problems will be solved by good data insights and good machine learning.
So: have you truly got horizontal peer support for these things? Shared accountability — not just shared ownership of data in the functional sense, but is the whole leadership team truly behind knowing where value will come from, and knowing they're part of achieving it together?
Not enough CDOs are willing to spend their own budget on literacy. If your literacy budget is only an L&D budget, that's a bit of a problem. We typically see programmes are more sustainable and deliver more value when they're driven by the data office budget, because it's a data office problem that's going to create the change and allow the value to be extracted. And think about your relationships with key stakeholders. If you look in the mirror, do you truly believe your answers to these questions are the right ones?
Jason told a brilliant story earlier that would fix a lot of the problems around shared accountability. I imagine in that business where there was a 7.9% uptick, there were loads of C-suite members who wanted to put their name to that program.
JASON: It's a coalition. You can't double bubble — you can't count the same benefit twice by saying everyone earned it; it just doesn't stack up that way. But where I see this work best is where there's a grown-up CFO and supporting groups who can have a mature conversation about working towards a common goal. Then the attribution of value becomes a lot clearer.
It depends on where you are in maturity. If you're early stage in terms of data investment, the RODI metric becomes a tool to articulate the value you're trying to achieve and win budget. If you're at the other end — a data-native organisation — you're not really reporting return on data investment; you're reporting on valuable business things happening, and the contribution of different groups is usually quite laser-focused.
If you're in a head of data governance, head of analytics, or CDO role and you can't genuinely look in the mirror and say you absolutely understand how a P&L works, how growth works, you'll really struggle to articulate the value you've contributed. There's as much responsibility on the data office to improve their business and commercial acumen as there is on the business to learn about data and AI.
GREG: Some more questions to consider — how are you measuring that more insights are leading to better actions? How are you measuring that better actions are leading to better outcomes?
Data offices are very keen on measuring outputs. Less good at measuring outcomes. The business cares about outcomes — it doesn't care how many dashboards have been created, how many products deployed, or how many people adopted them. It cares about what happened as a result that's measurable. That difference between outputs and outcomes is poorly understood across the industry.
How to build a balanced budget model: your budget model should be based on your current position in maturity and on the things that are going to drive key value. The premise of this session is not to tell you that you shouldn't be spending money on technology — of course you should. But if you've got an 80/10/10 or 70/10/20 type budget, you're probably not going to achieve what you want, and you're not going to evidence the value. And more than anything, you're not going to have an audience that understands the value — because there is a certain level of people, literacy, culture, and change that has to take place for your senior leaders to actually understand what's been delivered.
From a budget checklist: do you have clear business outcomes and owners? Have you verified with your leadership peers that if you deliver this, they agree the budget will have paid itself back and that you'll be responsible for that? Have you planned your skills and culture uplift? Do you have a clear framework for measuring outcomes downstream?
I would genuinely be investing in a value realisation professional if I were a data office. The hardest thing we have right now is to prove that we're valuable. So why are you not spending £50–60k of your multi-million-pound budget on a person to do that for you — getting into the business, engaging widely, and keeping it simple until you know you're going to deliver value?
With four minutes left, does anybody have any questions?
SARAH: Nothing through just yet.
GREG: Okay. If there are no questions, there are no questions.
[Q&A]
RICH: [via Q&A] What's the key thing in AI literacy?
GREG: I'll need more than that, Rich. If I were to answer that question as written, I'd say the biggest problem we still face is a lack of understanding of the art of the possible within our people. Most people can't articulate what they want to achieve from AI, or get their head around what they could achieve with it, which makes going on a literacy journey quite difficult. We do a lot of framing of use cases and problems that can be solved with AI.
At a foundational level, I'd say people need to understand what could happen if they learned and got this right. At the moment, most people just think they can write a better meeting agenda or summarise a meeting note. But what problems could they solve? That would always be my number one. Happy to take a follow-up if that doesn't quite answer your question.
JASON: A small build on that: it's about how AI will make an individual's world better. It's not really about the AI — it's about what they're trying to achieve because of this capability they've been given.
GREG: Well, we'll give everyone two minutes to get to their next meetings. Thank you so much for joining us. Hopefully it was a useful conversation. I've personally enjoyed it, and there are a few things I'm going to take away from Jason's side of this conversation. We're both on LinkedIn if you want to connect — though on your head be it, because you'll get served a lot of content. Thanks all.
JASON: Bye, everyone.
Unlock the power of your data & AI
Speak with us to learn how you can embed org-wide data & AI literacy today.
