The Great Erosion
On the fine line between AI augmentation and cognitive atrophy
I’m neck-deep in helping a few exceptionally large organisations work through their AI strategy and rollout right now. And I mean everything. Tying AI acceleration to their broader business strategy. Looking at partnerships and tool adoption. Efficacy measures. Impact on recruitment. The full machinery of institutional transformation, all the way down to how you measure whether any of this is actually working.
And one of the big things I’m really starting to worry about is talent. Specifically, the tightrope walk between cognitive augmentation and cognitive atrophy. Because I’m watching this play out in real time, and I’m not sure we’re getting the balance right. Actually, I’m fairly certain we’re not.
Right now, it’s really easy for us to just constantly say AI will make you better. AI will give you superpowers. AI will do the grunt work in your day-to-day job to free you up for higher-order thinking. We’ve become evangelical about it. In our rush to augment everybody, we’re pushing out Copilot licences like they’re going out of fashion. We’re getting people to sit through multiple AI training programmes. We’re baking it into annual reviews. We’re baking it into the recruitment process. Are you proficient with AI? Can you prove you’ve done something amazing to improve efficiency or effectiveness in some way? Show us the receipts.
The messaging is relentless, and it’s unidirectional. Use AI. Use more AI. Use AI faster. Be an AI-first organisation. Don’t get left behind.
But what we’re not talking about, what we’re not worrying about nearly enough, is the other side of the coin. If we’re running at this at full speed, if we’re pushing everyone to use AI because we think it’s some sort of panacea for every business challenge we have, what about atrophy? What about the erosion of expertise, of knowledge, of critical thinking, of the kind of deep collaborative thinking that actually produces breakthrough work?
And how do we guard against it when we’re simultaneously telling everyone that using AI is the measure of their competence?
Because I don’t think we’re sleepwalking into a potential catastrophe here. We’re sleep-running. And it’s far bigger than job losses.
Everyone is obsessed with job displacement. When will the robots come for our roles? How many redundancies? Which functions get automated first? And I get it, that’s a real concern. But the timeline on that is actually quite long. The vast majority of AI introduction and integration across big bureaucratic organisations is going to take a while. Those efficiency improvements and the reduced need for recruitment are not going to manifest as quickly as we think they are. It’s going to be a much slower process than the hype suggests. Procurement cycles, integration challenges, change management, legacy systems that don’t play nicely with new tools. The machinery of institutional change grinds at its usual glacial pace.
But cognitive atrophy - that can happen instantly. Someone can start reflexively outsourcing their thinking today, and the degradation begins immediately. No approval process required. No implementation timeline. No procurement cycle. Just a quiet erosion of capability, person by person, decision by decision. And it compounds. Every month of outsourced thinking makes it harder to do the thinking yourself. The muscles weaken. The instincts dull.
That’s the dangerous asymmetry nobody is talking about. The organisational benefits are slow. The individual damage is fast. And we’re treating this like the timeline works the other way around.
For advertising and creative industries, this risk is especially acute, and it terrifies me. The thing being atrophied is creative thinking, conceptual originality, the ability to make unexpected connections. This is both your core product and notoriously difficult to rebuild once lost. You can’t just retrain someone back into having a creative eye after two years of prompt-dependent work. I’ve seen people try. It doesn’t work like that.
These capabilities are built through struggle. Through thousands of hours of wrestling with ambiguity. Through the compound learning that comes from doing the hard thinking yourself. Through sitting with a brief for three days and letting your subconscious chew on it before you touch a keyboard. Through the boredom and frustration and false starts that eventually crystallise into something genuinely new.
We are, first and foremost, a creative and talent-led industry. That’s what we do. That’s the only thing we do. There is no creativity without the humans in the building. And yet we’re racing to offload the very activities that build and maintain creative capability.
What’s most insidious about this is that early-stage atrophy feels like productivity. The person thinks they’re getting so much done. Look at all this output. Look how fast I can turn things around. Meanwhile, they’re actually hollowing out the very capabilities that made them valuable in the first place. The output looks good. The velocity is high. The decline is invisible.
By the time organisations notice that all their creative work feels samey, that it lacks the spark of genuine originality, that it’s all starting to blur together into a kind of competent blandness, they’ve potentially lost years of skill development across entire cohorts. And you can’t get that back. You can’t restore experience from backup.
We’ve been talking a lot about AI giving you superpowers. AI augmenting innate talent. AI helping you do more of what you could already do but better and bigger and more expansive. Freeing up your time for higher-order thinking. And all of that is true. I believe in that possibility. I’ve seen it work beautifully when done right.
But the other side of the coin is atrophy. And I don’t think we’re talking enough about that balance. I don’t think we’re even acknowledging that there is a balance to strike.
There’s a core tension here that we’re getting completely backwards. AI should amplify the twenty percent of your thinking that’s genuinely novel, freeing you from the eighty percent that’s repetitive. The stuff that doesn’t require original thought. The formatting, the restructuring, the administrative overhead. But in practice, I’m watching people outsource the twenty percent, the hard conceptual work, and keep the eighty percent, the execution and formatting.
This is backwards. And it’s catastrophic.
Because the twenty percent is where you build the muscles. The eighty percent is just going through the motions. You’re supposed to offload the motions and do the heavy lifting yourself. Not the other way around.
And then there’s the junior talent problem, which is maybe the scariest part of all of this. If you’re twenty-three and entering creative work right now, you could theoretically go your entire early career without ever building the foundational cognitive muscles that create senior creative excellence. Pattern recognition. Conceptual synthesis. The ability to tolerate ambiguity and sit with discomfort. The instinct for when something is genuinely new versus when it just looks shiny.
These aren’t things you can teach in a workshop. They’re built through years of doing the work. Years of making mistakes and learning from them. Years of having to figure things out the hard way because there was no shortcut available.
Five years from now, who mentors the next generation if the current cohort never developed those muscles themselves? We’re not just risking individual capability here. We’re risking the entire knowledge transfer chain that creative industries depend on. The apprenticeship model breaks down completely if the masters never actually mastered anything. If they just got very good at prompting.
This keeps me up at night. Are there people in organisations right now who have already atrophied significantly? And if so, how would you even know? The work might still look fine. AI is good enough to mask decline for quite a while. The presentations are polished. The copy is clean. The concepts are fine.
But fine isn’t what built your reputation. And fine won’t differentiate you in an increasingly commoditised market where everyone has access to the same tools. Fine is the sound of a creative industry eating itself from the inside out while congratulating itself on its efficiency gains.
So what do we actually do about this? Because I’m not saying don’t use AI. That would be stupid. The genie is out of the bottle and there are genuinely transformative applications of this technology. I’m saying we need to be incredibly deliberate about how we integrate it. We need to slow down and think about what we’re optimising for.
This is a tightrope walk where you can’t see the other side yet, the rope is swaying, and everyone’s being encouraged to run. The move-fast ethos that works for software deployment is potentially catastrophic when applied to human cognitive development. You can’t A/B test your way out of atrophy. You can’t pivot when you realise you’ve lost a generation of expertise.
This requires incredibly careful rollout. Incredibly careful monitoring. Clear guidelines that distinguish augmentation from abdication. It requires honest conversations. Are we optimising for short-term velocity or long-term capability? For individual productivity or organisational resilience? For looking like we’re doing something innovative or actually building something sustainable?
It requires checks and balances to monitor potential red flags for atrophy versus augmentation. And it requires the courage to slow down when everyone else is speeding up. To say no, we’re not going to mandate AI usage. We’re not going to tie it to performance reviews. We’re not going to make it the primary measure of competence. Because that creates the wrong incentives entirely.
We need to think about struggle quotas. Deliberately working without AI on some portion of creative work, the way athletes train at altitude. If you never struggle, you atrophy. That’s not motivational nonsense, that’s neuroscience. We need to think about protecting certain cognitive domains as AI-free zones where people maintain raw capability. Not because we’re Luddites, but because we understand how skills are built and maintained.
We need to think about what it means to hire and develop junior talent in this environment, when the temptation to shortcut the learning process is immediate and constant. How do you teach someone to think creatively when they have a button that does the thinking for them? How do you build resilience and problem-solving ability when the path of least resistance is always to outsource the problem?
We need to start asking different questions. Not just “are we using AI” but “are we using it in ways that make us stronger or weaker in the long run.” Not just “is the output good” but “could we still produce this output if the AI disappeared tomorrow.” Not just “are we being efficient” but “are we building the compound expertise that will matter in five years when everyone else has the same tools.”
Because right now, in the race to do anything and everything with AI, we risk losing the very things that made us worth working with in the first place. The efficiency gains will come slowly. The talent drain is happening now, in real time, in ways that won’t show up on a dashboard until it’s far too late to course-correct.
Count me out of any strategy that doesn’t reckon with both sides of this equation.


