What you'll take away from this article
- Why AI investment is stalling and why it's not a technology problem
- The real reason 95% of enterprise AI pilots never reach production
- Why data & AI literacy are the same conversation
- The five mistakes CDOs make when designing literacy programmes
- How to measure ROI from literacy beyond training hours and course completions
- What a genuinely data and AI literate organisation actually looks, feels and sounds like
- Why the frozen middle is where most transformation programmes quietly die
The tension nobody wants to name
Organisations are spending heavily on AI. New tools, platforms and pilots are being launched every quarter. The investment is accelerating, but measurable impact is not, and most leaders know it.
While the technology itself has flaws and challenges, the main gap comes down to people. They're suddenly expected to use it, trust it and derive value from it, yet they are consistently left behind: untrained, unconvinced and overwhelmed. So the gap between investment and value keeps widening.
This is the tension at the centre of every boardroom conversation about data and AI in 2026: the gap between the ambition of the strategy and the readiness of the workforce to deliver it.
"If you build it, people will get bored. Or not understand. Or be overwhelmed. You can't just build it and hope for the best when 90 to 99% of people don't have that background."
— Jordan Morrow, SVP Data & AI Transformation, Agile One
Jordan Morrow has been in this space since 2016. Often referred to as the godfather of data literacy, he has advised some of the world's most influential organisations on how to close the gap between data capability and human readiness. His view is unambiguous: the tools are not the problem.
Greg Freeman, CEO and Founder of Data Literacy Academy, agrees. His business has spent years working with large enterprises on exactly this challenge: bringing people outside the data and IT function on the journey. The pattern he sees, repeatedly, is the same.
"If people don't understand something, they don't trust it. And if they don't trust it, they're not going to use it. Literacy is the foundation of that. People will understand, then trust, then use."
— Greg Freeman, CEO & Founder, Data Literacy Academy
The numbers tell the story
The scale of the problem is not theoretical. The data is consistent, and it is damning.
- 95% of enterprise AI pilots deliver no measurable P&L impact and never reach production (MIT)
- 46% of AI leaders cite skills gaps as a major adoption barrier (McKinsey)
- 6% of corporations are seeing tangible enterprise-level AI ROI (McKinsey)
- 42% of leaders report significant positive ROI where mature literacy upskilling is in place — versus 22% without it (Industry data)
That 20-percentage-point gap between 22% and 42% is not accidental. It is the direct result of having, or not having, people who can engage meaningfully with the tools, the data and the decisions being made.
Shadow AI compounds the problem. Workers are using AI without their organisations knowing. That is a governance failure waiting to happen, a security risk and a sign that your executive data literacy is not where it needs to be.
"Shadow AI is thriving. And that shows me your data governance is struggling, your executive data literacy is struggling, and AI literacy is struggling. People are using AI without you knowing, and that is a governance, security and ethical risk all over the place."
— Jordan Morrow
Why AI literacy and data literacy are the same conversation
There is a dangerous trend emerging in enterprises. AI is being separated from data. It's increasingly being treated as a distinct, shinier priority. CTOs and CIOs are gravitating toward AI because it carries career currency in a way that data no longer does.
The problem is that AI is only as good as the data that feeds it. Organisations that silo AI strategy from data strategy are not accelerating. They are laying the groundwork for expensive failure.
"Your data and AI strategy are your business strategy. They're one and the same. If you silo anything off, good luck. Without good data quality, you're not just missing value. You might eventually be in violation of the EU AI Act or GDPR. And a startup that did it right will have commoditised your position before you've noticed."
— Jordan Morrow
The competencies required are also the same. Whether working with data or AI, the skills that matter are the ability to read, work with, analyse and communicate. Jordan calls these the three Cs: curiosity, creativity and critical thinking. They are as relevant today as when he first articulated them. In fact, they are more relevant.
AI is now doing significant cognitive work for people. If your workforce hands over its thinking entirely without the critical faculties to evaluate what comes back, you are weakening the very capabilities that make humans valuable in a data-driven organisation.
Teaching people how to click buttons in Copilot is not a literacy programme. Teaching prompting alone is not a literacy programme. A literacy programme builds the mindset, the judgement and the behaviours that allow people to use data and AI to make better decisions, and to know when not to.
The data confidence problem
Most organisations treat data and AI literacy as a skills gap alone. It is not. The deeper barrier is confidence and mindset.
When you ask a room full of people whether they are data literate, most will say no. But ask those same people whether they check a weather app before travelling, or understand what fuel to put in their car, and they will say yes, of course. Those are data-informed decisions. Most people are already making them without realising it.
The job of a well-designed literacy programme is not to turn every employee into a data analyst. It is to help people recognise that they are already further along the journey than they think and to build from there.
Adults are particularly resistant to this kind of change. They have established ways of working, ingrained habits and a natural instinct to protect what they know. That is why every literacy programme needs a change management spine. Mindset and behaviour must change before skills are acquired, and desire must come before knowledge.
The organisations that fail are the ones that assume everyone will simply adopt. The ones that succeed treat this as a hearts-and-minds programme and design their strategy accordingly.
The five mistakes CDOs keep making
Data Literacy Academy has designed and delivered over 100 enterprise literacy programmes. The same failure modes appear, repeatedly.
1. Overestimating the organisation's readiness
The people who come to a CDO are disproportionately those who already see the value. The other 20,000 who didn't knock on your door are a different story. Do not mistake vocal early adopters for enterprise-wide appetite.
2. Believing one size fits all
A marketing team will contain people ranging from deep analytical thinkers to those who see data as irrelevant to their work. Designing one programme for all of them is not efficiency. It is a guarantee of mediocre results across the board.
3. Assuming people want to learn
Some do. Many don't, at least not by default. That is a change management challenge: your curriculum can be incredible, but if you don't first show people what's in it for them, it won't matter. Inspiring people to grow and learn is a necessity for any cultural change.
4. Trusting the next technology solution to fix the culture
It will not. Giving someone rugby boots does not make them a World Cup contender. A tool without the capability to use it is budget spent, not progress made.
5. Being disconnected from business strategy
If the literacy programme cannot articulate its connection to organisational outcomes, whether that's revenue, risk reduction or operational efficiency, it will be the first thing cut when budgets tighten. The link to corporate and data strategy is not optional, and must be designed in from the start.
The frozen middle: where transformation goes to die
Senior leaders sponsor data and AI programmes, and frontline workers are asked to adopt them. But between those two groups sits the middle management layer, and it is here that most transformation efforts quietly stall.
Middle managers are the people who must translate strategic intent into daily behaviour change. They manage the teams who need to adopt new ways of working. And they are also the ones most likely to feel threatened by it: worried about being exposed, outpaced by their own people, or made redundant by the capabilities they are supposed to enable.
"The frozen middle is the most forgotten space in data literacy. They have to answer to executives and manage those going through the programme. To me, that is the secret to success: you have to thaw it out."
— Jordan Morrow
A targeted leadership programme helps turn this essential demographic into your greatest champions. If managers are not on the journey themselves, they will, consciously or not, hold their teams back from the progress that's possible. No frontline programme survives a hostile or disengaged middle tier.
What good looks like in practice
So what does a genuinely data and AI literate organisation feel like? Not in theory, but in practice?
The behaviours are the tell. In a literacy-mature organisation, people say 'my hypothesis is' rather than 'I think' or 'I know'. They invite challenge of their analysis. They consistently question the data before acting on it. They are comfortable with uncertainty and equipped to navigate it. The data and AI office is not seen as a separate team with a separate agenda, or as a ticket-taking help desk, but as a strategic partner in solving the organisation's actual problems.
The metrics shift too. You stop measuring training completions and start measuring the elimination of low-value work. Jordan cites a customer where a 70-slide PowerPoint deck was replaced by six charts visible on a phone. The output was better, and the time saved was significant. That is data literacy delivering ROI in the real world.
Data Literacy Academy has documented similar outcomes from its client work:
- A project management team saved 270 hours by applying data thinking to how they tracked and managed resources.
- A £10 million payment risk was avoided because the right people had the skills to spot what had previously been invisible in the data.
These are not learning and development outcomes; they are business outcomes. That is how literacy programmes should be evaluated, and how they should be sold to CFOs and executive sponsors.
"We hate measuring by L&D metrics and training hours. The question is: have we saved time, reduced risk, improved decisions? Have we made the business better?"
— Greg Freeman
There is a structural tension in how organisations approach AI investment. The macroeconomic environment demands in-year returns. But meaningful AI ROI does not arrive in year one. The evidence from Davos and the organisations doing this right points to a one-to-two-year horizon, and that assumes you have built the human capability required to extract it.
If you are a data or AI leader who cannot manage that narrative with your executive team, you are in a difficult position. The answer is not to overpromise. The answer is executive literacy: ensuring the people allocating budget understand the reality of the timeline and the conditions required for returns to materialise.
Organisations that skip literacy and go straight to tooling will spend heavily, see limited results and draw the wrong conclusions. The problem was never the technology. It was always the readiness of the people expected to use it.
Five principles for getting this right
From over 100 enterprise programmes, Data Literacy Academy has distilled five principles that separate the programmes that drive measurable change from the ones that don't.
1. Assess before you build
Baseline your organisation's literacy before you design anything. If 2,000 Copilot licences have been issued and fewer than 100 are active, you need to focus on readiness.
2. Anchor to business strategy
Ideally, every element of your programme should trace back to a strategic priority. The connection between literacy and value must be explicit and visible to senior leadership at all times.
3. Find your executive sponsor and be realistic
Not every executive will genuinely believe in the mission. Some will only say the right things, and lip service is not sponsorship. Find the ones who actually believe and build from there.
4. Address desire before knowledge
People need a reason to want to change before they will engage with what you are asking them to learn. Skip this step and behaviour will not shift.
5. Design for embedding, not completion
A programme that ends at the training session has already failed. The real work happens when people return to their desks and put theory into practice. Design reinforcement into the programme from the start: behaviours, language, habits and peer accountability.
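Principle 1's baseline can be made concrete with a few lines of code. This is a minimal sketch rather than a prescribed method: the function and field names are hypothetical, and the figures simply echo the Copilot licence example above.

```python
# Hypothetical baseline calculation: compare licences issued against
# active users to quantify the adoption gap before designing a programme.

def adoption_baseline(licences_issued: int, active_users: int) -> dict:
    """Return simple readiness metrics derived from licence data."""
    rate = active_users / licences_issued if licences_issued else 0.0
    return {
        "licences_issued": licences_issued,
        "active_users": active_users,
        "adoption_rate": round(rate, 3),
        "dormant_licences": licences_issued - active_users,
    }

# Figures from the article: 2,000 licences issued, roughly 100 active.
baseline = adoption_baseline(2000, 100)
print(baseline)
# An adoption rate of 0.05 means 95% of the licence spend is dormant,
# which is the readiness signal principle 1 asks you to capture first.
```

However the numbers are gathered, the point is the same: establish a quantified starting position, because "fewer than 100 of 2,000 licences active" is a far more persuasive case for a readiness programme than a general sense that adoption is low.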
The competitive reality
AI-native startups are being built by small teams, moving quickly and targeting the inefficiencies of legacy enterprises. They do not carry the weight of 20,000 employees who never received a literacy programme, and they aren't waiting for adoption to happen organically.
The gap between what your workforce knows and what is possible with data and AI is widening. Organisations that close that gap deliberately and systematically will outcompete those that don't make it a priority.
"Literacy opens possibility. You now have a PhD partner available to you 24/7 in generative AI. But two parts human, two parts data, that formula only works when the human side is ready to engage."
— Jordan Morrow
Data and AI literacy is not an L&D initiative. It is the strategic infrastructure that determines whether your AI investment returns value or disappears into the trough of disillusionment.
The question is not whether your organisation needs it. The question is whether you act on that before the window closes.
GREG FREEMAN
My name is Greg Freeman. I'm the CEO and founder of Data Literacy Academy. I launched the business a few years ago and ever since we've been on this journey of taking large enterprise organisations through the data and AI culture and literacy programme, one that ultimately targets those out there in the business, outside of the data and the IT team. Our specialist audience is: how do we bring people who are not data and AI professionals by trade on the journey? I am very happy and privileged today to be joined by Jordan Morrow. Nobody goes further back in this space of data literacy than Jordan. He is often referred to as the godfather of data literacy, has a huge following on both sides of the Atlantic in terms of his own influence, and speaks and guides some of the most influential organisations in the world on the topic of data and AI literacy. To have had Jordan reach out just before Christmas and say it'd be cool to do some stuff together was a real privilege and the first manifestation of that is this session right now. I'll let Jordan introduce himself and then we'll get into the content.
JORDAN MORROW
Thank you, Greg. I'm Jordan Morrow, owner and founder of Bodhi Data, Senior Vice President of Data and AI Transformation at Agile One, but really just a huge nerd. Seeing what Greg and his team have been doing at the Academy, I'm very selective on who I partner with and I'm very happy to partner with this organisation. I think they're doing the best job on the planet when it comes to data literacy and the empowerment of people. This opportunity is my honour. I've been on this journey since about 2015, 2016. Never did I imagine it would grow the way it has. I'm a published author, my fifth book came out just last week. Super happy to have a fun, nerdy conversation with you today, Greg.
GREG FREEMAN
We're here today to talk about how both data and AI literacy are really the foundation and the underpinning of good enterprise enablement of AI, the very sexy and exciting topic the world is currently obsessed with. In the nineties, everybody started to get their head around the fact that skills like Excel, Word and PowerPoint were skills you needed to stay employable. In 2026, that journey is data, but particularly AI skills. The question we have to ask ourselves is: is our company helping us go on that journey and therefore helping themselves, or are they leaving the people to figure it out for themselves? Jordan, you've been on the data and AI literacy journey since 2015, but you've been around the space of technology and skills for far longer than that. How do you think about the journey of AI skills and the appetite for AI skills when you compare it to something like Excel or the Microsoft stack when digital first became a key thing?
JORDAN MORROW
I've never seen anything like it and that's the truth. There's a balance to it too. There's all this desire to use AI to upskill and bring people along, while at the same time there is legitimate AI fatigue. People are tired of hearing about it, but they know they need to use it. The world of data and AI literacy has never been bigger, because in order for organisations to succeed, they need these skills. But here's the crux of it. In the world of data, all these hyped-up terms, including data literacy, have been seen as magic bullets to solve things. Data science, AI, machine learning, data engineer, data scientist, data literacy. 90 to 99% of people are not data professionals by background. Organisations know they need this. They know they need AI skills. But they really struggle to figure out how to systematise, organise and operationalise all these pieces together. I've never seen anything like it, and it's balanced with people being overwhelmed. That's why I think the Academy is doing such a great job.
GREG FREEMAN
That's such an interesting point around the technology work and the strategy work and the structures. Something we see quite often as a challenge and it's definitely a data and AI problem, is this concept from Field of Dreams: build it and they will come. The data and AI industry, heavily funded by Silicon Valley and lots of VC money, kind of gives the impression that you just build stuff and the people jump on the journey. Why wouldn't they? Is that what you've seen?
JORDAN MORROW
One hundred per cent. And it's sad, because if you build it, people will get bored, or people will not understand, or people will be overwhelmed. You can't just build it and then hope for the best when 90 to 99% of people don't have that background. Especially when how many people purposely choose not to study this in school? Unlike sports, my favourite sport is rugby, and people love sports, so they will come to watch it. If you just throw an AI or a data tool in front of people, they might play around with it and then go back to their old way of doing things, because they don't know what to do. That's where the literacy side comes into play.
GREG FREEMAN
For anybody who's not aware of Gartner's hype cycle concept, hype cycles are flows that new technologies and new ways of working go through. Gartner tracks it based on how hyped up they are at first, when they hit the peak, and then what happens after. I think the strategic position of AI within the hype cycle, especially generative AI, because it's become so mainstream in people's lives, we're already seeing that due to lack of adoption, lack of value realisation, and a lot of money spent without much gained back, it's already starting to slide into the trough of disillusionment. That's where people start to believe they've heard all the hype. Most businesses on this session are going through this right now. They've been promised massive things. Jordan, you obviously speak to a lot of senior executive teams. How are you finding execs are starting to respond to AI now? We're two years in and they've maybe not seen the money coming back from the other side that they expected.
JORDAN MORROW
I think there's a frustration for some leaders, and I also think there are some leaders who still realise this is going to provide a lot of value. What I find really interesting is that it takes data literacy to realise you can't measure the value from AI in old-school metrics. It is very different. This is something we've never seen. The internet democratised information. AI is democratising intelligence and knowledge. That is different, and it's systematising. I actually disagree with the hype cycle a little, because I don't think you can fit AI into the normal hype cycle. You've got people who are frustrated, overwhelmed, and don't know what to do, yet are still throwing money at it. Because they know there's something there. They just don't know how to get there. That's the literacy side. That's the upskilling. When I was in a private meeting with a speaker from Davos, World Economic Forum, she was going over the top ten AI insights from the forum. Number one was AI skills. You're not going to just be able to throw a tool in front of someone and have it work. You have to teach them at the same time. I think while there is some truth to the hype cycle with AI, there's more to it this time, considering how much power AI has and how blasted quickly it's advancing.
GREG FREEMAN
I agree. And although it sounds like we're motivated to say this, I do genuinely believe AI is a material change in the way we work, in the same way the internet was. People say, 'We've heard it before, blockchain was going to change the way we bank.' Those things come and go and fail. But CRM was a hype cycle and it's still here 25 years later. The internet was a hype cycle and it's still here 25 years later. We've got to get our heads around the fact that this is here to stay. But we are dependent on bringing the people on the journey to avoid an entire trough of disillusionment.
It's really interesting, the point you made about not measuring in conventional ways. We had an internal use case this week where one of our salespeople has probably solved a problem we would never have actually got round to solving. You could ask whether that's valuable, because we were never going to solve it anyway, but the way he's used AI to map accounts, map the businesses we work with and their potential value realisation, it was such a big thing, we couldn't solve it with human capital. It was always going to be too big. But he has now solved it, and that's a massive return, even though we can't really quantify it.
JORDAN MORROW
How do you quantify the value of the internet? I talk about the internet a lot. There is a bubble around AI, just look at some of the investment, just like there was a bubble around the internet. When that bubble burst, yeah, it had economic ramifications. The power of the internet didn't go away. Now, can you imagine your life without it? ROI from AI could become exponential if companies do it right. That's the problem they're running into. Number one, they're trying to bucket it. Number two, they don't understand the attribution effect, you can't quantify what solving that problem did, but there is value behind it.
GREG FREEMAN
We truly believe, and we see some brilliant use cases, that if people want to get AI adopted, they firstly have to think about literacy. Secondly, there is no AI without data still. Data quality has been a boring problem for a long time, but if we don't get data quality right, the outcomes we get from AI won't be right. If your people are sat in their day job making decisions on the back of Copilot, or even larger language models that they don't understand, and they don't know how to stand by their decisions because they can't analyse or evaluate them, you've got a massive ethical problem. Most of the blockers you'll find with data and AI adoption are really about people's confidence and their mindset. If you just think about data or AI literacy as a skills problem only, you're absolutely missing the point. The point is: how do we build people's confidence? How do we help them adjust their mindset to wake up every day and think about the problems in their day job that could be solved by data and AI? Most people still do not wake up every day and think that problem they've faced for the last ten years could be solved today with data and AI.
JORDAN MORROW
When you see data and AI literacy in practice, people are confident that they can experiment and fail without being in trouble for it. One of my favourite quotes is from Nelson Mandela: 'I never lose. I either win or learn.' You work on a data project for six months, your mindset is: I might not get it right, but I can learn so much along the way. And as a leader, I'm still going to reward you with a bonus. It's curiosity. You question everything and you allow your work to be questioned. Those three Cs: curiosity, creativity, critical thinking, combined with that mindset: that's what a data and AI literate culture looks like. Psychological safety is one of the key ways you find data literacy and AI literacy succeeding. It's the behaviour of the people that tells me if a company is doing this right or not.
GREG FREEMAN
When I speak to people about data and particularly AI literacy, the human-in-the-loop concept is very prevalent. Are we actually still able to have employees who think for themselves and can critique what's coming out the other side of the tools they're using? Those three Cs become even more vital in a world of accelerated technology performance, where a lot of the technical work can now be done for us. If you just teach people buttonology and call that a data literacy or AI literacy programme, you are letting your people and yourself down. The buttons don't teach you how to use the thing. It's the critical skills behind it, the mindset, the behaviours, the logic, the first principles. A good literacy programme isn't just giving people Copilot training.
JORDAN MORROW
Can I piggyback on that? When I think about why I created this in the first place, I created, while working at American Express, a plan to teach people how to use basic statistics and analytics, because all we were doing was training people to click buttons. Training someone to use Tableau, Power BI, Sigma or any of these tools is not data literacy. It's tool literacy. That doesn't mean you know how to use the data to make a decision. American Express shut me down and said they weren't ready for it. I'm glad they shut me down, because then I got hired by Qlik, got to be an entrepreneur and build the whole thing. A few years later, guess who came calling for data literacy help? American Express. Don't just teach buttons. Don't just teach prompts. Teach people how to use data and AI to make their life better, and they will use it to make their life better.
GREG FREEMAN
The principles of data quality, the principles of the data value chain, the principles of knowing that there's a continuous loop where things are being fixed and improved, these still exist. Something I'm seeing, and I'd love your view on this: data and AI are all of a sudden being separated in terms of how people are thinking about them in businesses. CTOs and CIOs are gravitating toward AI because it's something they can build a career off, if they hit a jackpot, it'll be on their CV. Data stopped being that about ten years ago. It was supposed to be the oil that fed people's careers. Now it's just a headache that nobody wants to own. That's driving the separation where data teams and CDOs are becoming almost like the poor younger sister of AI. If this isn't part of your thinking with data quality and data governance, they're never going to use AI responsibly if they don't use data responsibly. Are you seeing the same thing?
JORDAN MORROW
I think we'll see that in perpetuity, and it's human nature. People are buying into the hype and trying to protect themselves by saying, 'I get to own the AI side of this.' But AI is only going to be as good as the data that feeds it. The moment you silo that off, in reality, your data and AI strategy are your business strategy. They're one and the same. If you silo anything off, good luck to you. Without good data quality, you're potentially in massive trouble. Not just because you're not going to get value. You might eventually go in violation of the EU AI Act, the Colorado Act, GDPR, and data privacy. A startup over here did it right, they've commoditised it, made it easier. There's just no reason to do it wrong.
GREG FREEMAN
First: I see this every day, a lot of people overestimate the data literacy of their company, and honestly even overestimate the appetite of their company. The people who come to you as CDO are disproportionately those who already see the value. That doesn't mean the other 20,000 do. Second: believing a one-size-fits-all approach will work. It won't. You can't give everybody the same training, even in the same job role. Go into a marketing team, some of them could be the most analytical people in the world; some could think it has no value to them. Third: assuming that everyone wants to learn about data and AI, definitely not the case. This has to be a change management piece. If you just think about this as a skills programme, you will lose. This is about winning hearts and minds, changing mindset and behaviour. Fourth: trusting that the next tech solution will fix the culture. It isn't a thing. The tech will not fix the people. Fifth: being too disconnected from corporate strategy. We're getting better at repeating 'make it part of the strategy', but actually connecting that dot between strategy and value is still a massive gap in the industry.
A lot of data and AI literacy programmes obsess over the stuff at the top of the curve, how do we get people to self-serve? How do we give people all the skills we need? Actually, most of your organisation, and Jordan's point about 99% of the population not being data or AI professionals, most people in your business won't even really understand why they should be using it. Why is it valuable? What problems can it help me solve that I don't know about yet? How do I think about the things the business tells me I should care about, like using the right tool? If we obsess over everything at the top, we leave so much of the population behind. We really need to think about what our programme is aiming to deliver. Is it only raising the ceiling? If 80% of the population doesn't have the right confidence or mindset and you miss all of that, all of a sudden you've let the population of your business down, and your adoption will never get over 15 or 20%. You've not brought the more difficult people on the journey.
The figure that gets batted around the most: 95% of enterprise AI pilots deliver no measurable P&L impact and never get to production, according to MIT. 46% of AI leaders cite skills gaps as a major adoption barrier, from McKinsey. Around 6% of corporations are seeing tangible enterprise-level AI ROI, from a similar McKinsey report. These things don't come together well when only a subset of the population is able to engage with AI programmes. At the bottom: 22% of leaders report significant positive ROI, rising to 42% where mature literacy upskilling is in place. That 20% difference comes from having the people who can help you extract the value.
JORDAN MORROW
Yes, and the MIT report illustrated to me that there is a true literacy gap, because when you read that metric you think they all failed. But shadow AI is thriving. And that shows me your data governance is struggling, your executive data literacy is struggling, and AI literacy is struggling, because you have people using AI without you knowing. That is a governance, security and ethical risk all over the place. Companies are feeling the pain of putting all this money behind things and asking: why am I not seeing anything? Putting money behind it alone does not change things. People and behaviour change things. If you want that 42%, you have to be monitoring not just usage, but change in behaviour. Change management is a piece of it. Executive leadership is a piece of it. And by the way, from that Davos conversation I had, it is a one-to-two-year ROI horizon on AI. Companies want it tomorrow. So there is a real tension in that literacy side.
GREG FREEMAN
It's such a difficult situation. The macroeconomic environment requires in-year returns, but so much money is being spent on something that shouldn't necessarily give you material in-year returns. It's in year two, year three, year four, when you really get your head around it as an organisation and find the right use cases, that it drives immense ROI. But if you as a data or AI leader can't control the narrative around how quickly ROI should be received, you are really going to struggle. That comes down to having a literate exec. If people think data and AI can be solved tomorrow, you've got a real problem. You have to educate them to understand that can't be the case, that things are deliberately slow, especially in regulated environments; you've got to bring them on the journey as well. It really is a top-down, bottom-up approach.
We talk a lot about the frozen middle, those middle managers whose world is easy on a day-to-day basis and who really don't want their world to change. We have to hit them with a targeted leadership programme. If you don't unlock the frozen middle, nobody on the front line is going to be allowed to do it. We came across a leadership group recently who were keen to move their own AI literacy forward, but were at the same time complaining that their people were doing their work using AI. It was like: that is what we're selling you here. The plan is to encourage and enable them, not to be uncomfortable with it. You need all the layers coming together in the right way.
JORDAN MORROW
The frozen middle is the most forgotten space in data literacy. I find they have the most important job, because they have to answer to executives and manage the people going through the programme. They have to have their own literacy that combines with the larger literacy effort. That is the secret to success, the frozen middle. You've got to thaw it out.
GREG FREEMAN
First: establish and assess before you build. Whether you do an enterprise baseline assessment or simply identify where the gaps are, if you know your adoption of AI is low and can quantify it, for example 2,000 Copilot licences issued and only ten in use, you've got a problem. You need those baseline measures.
Second: anchor to the strategy, always. Find your executive sponsor; you're not going to win over the whole exec. It would be career suicide for an exec to say data and AI are not valuable, but genuinely believing it and articulating how the world looks different is another thing. Find the people who really see it.
Third: address desire before knowledge.
Fourth: choose where you go first. Your first cohort needs to prove value, evidence success and demonstrate adoption. Don't just go horizontal; think about where you go first and then design the change management programme around it.
Fifth: design for embedding, not just learning. What we really need is people who know how to embed this into their work afterwards; that's where reinforcement comes from.
JORDAN MORROW
I love your pyramid of confidence and mindset. I think mindset is one of the most important words here. When you see data and AI literacy in practice, people are confident they can experiment and fail without being in trouble for it. It's the curiosity: you question everything and you allow your work to be questioned. You should be welcoming challenge left and right. It doesn't mean you're wrong; we just have to be open to other possibilities. Those three Cs and that mindset, combined, are a work in progress, but one that is succeeding. As Alli Hartman put it in the chat, psychological safety is one of the key ways you find data literacy and AI literacy succeeding. It's the behaviour of the people that tells me whether a company is doing this right or not.
GREG FREEMAN
And a few other things. People see the data and AI office as a partner, not as another team. It's not their problem, it's our problem together. That's not from a boring data ownership perspective, it's: how can we solve problems together as peers? And even casual language things. In our business, disproportionately compared to large enterprises, people will say 'my hypothesis is this', not 'I think this', not 'I know this', not 'my experience tells me'. My hypothesis is this. We need to look at it. We need to prove it. Those are things you'll see and feel within a culture.
There is a massive data literacy ROI piece here, and this is how we want to measure it. We hate measuring by L&D metrics and training hours. A project management team we work with saved 270 hours because they became more efficient in how they managed their projects. They missed fewer deadlines. They were able to track everything because they got into the data and understood that project management is a data problem, it's how you move resources and get them well aligned. A ten-million-pound payment risk was avoided. There are mistakes being made in businesses every day that are underpinned by data that people just don't know to observe or care to observe. Those things are absolutely huge in terms of how we avoid making mistakes that cost a business money.
Looking ahead two or three years, what does the role of a CDAO look like if they get this right? I think if I was a CDO, I would be putting investment into this space because it makes me more likely to keep my job. Having a data and AI literate workforce means they understand my value and what we're trying to achieve. What do you think it looks like for them, Jordan?
JORDAN MORROW
For me, what does the organisation look like in two to three years? It is a continual evolution: shifting understanding and kicking projects to the kerb when necessary. One of the top ways to measure whether data literacy is working is how much work you have eliminated. Most organisations are doing more work right now than they need to, and it's not even close. Have we taken a dashboard from 20 tabs down to two? I had a PowerPoint that was over 70 slides long; I cut it down to six charts viewable on a phone. That's data literacy. I'm looking for organisations where AI and data are native to everything you do, not a feature on the side. How much work have you given up? How much have you streamlined? Those are ways to look at it that aren't just: you checked a box and took the course. It's an actual transformation that works.
GREG FREEMAN
We had a conversation the other day with a business, I won't name them, whose exec receives a 154-page PowerPoint deck on a weekly basis. I said: what a good data-literate workforce looks like is that deck becoming ten slides, not 154.
JORDAN MORROW
I stopped sending that kind of report and just sent six charts viewable on a phone. The smartest person I ever worked with at American Express, the Chief Risk Officer, received that report. Guess how many people asked me for it? Zero. They didn't need it. It showed all that work being done. They just needed six charts with descriptive and predictive analytics. That's all they needed. If you're building something that's 150 slides long, please go talk to your leader and say: can I redo this?
Literacy opens possibility. Not only does the art of the possible open up when you partner with data and AI more, but with generative AI you now have a PhD-level partner available to you 24/7. My formula: engineered intelligence is data plus AI plus IQ plus EQ. Two parts human, two parts data; it comes to life when you use these things for the right reason. And one of the realisms around this: being more data and AI literate makes you more marketable in a world where jobs are being affected. We've seen it recently, companies letting go of significant portions of their workforce because of AI. How do we make our value more tangible to our companies? We do it through literacy.
GREG FREEMAN
I agree. Your career will be enabled by being data literate, and that's a really important thing. Unfortunately we haven't got time for all the questions. Any questions that haven't been answered, we'll answer offline and connect with you on LinkedIn with the answer, because we don't want to ignore them. We've also just released a new AI literacy analysis model; if you're thinking about how to know whether your organisation is AI literate, follow the link in the chat. Thank you so much to Jordan. We're looking forward to doing more of these. I've loved the energy of the conversation.
JORDAN MORROW
That is a brilliant thing. Thank you to everyone who's joined us. I will see you all very soon.
Unlock the power of your data & AI
Speak with us to learn how you can embed org-wide data & AI literacy today.



