The analogy that should worry every leader
Ask someone under 30 to navigate from one side of a city to the other without their phone and watch what happens. It's not that they're less intelligent than previous generations; it's that GPS meant they never needed to develop the spatial reasoning, landmark awareness, and mental mapping that older generations built through years of navigating without assistance.
This is the GPS problem: a tool so useful that it doesn't just replace a task — it prevents the development of the underlying capability. And it's exactly what AI is doing, right now, across almost every domain of knowledge work.
If anything, the GPS analogy understates the risk. GPS eroded a single skill: navigation. AI is touching almost every knowledge-work skill simultaneously, including writing, analysis, coding, decision-making, research, legal reasoning, and medical diagnosis. The scale is without precedent.
Where capability erosion is already happening
This isn't a future risk — it's happening now, in organisations that have moved quickly to integrate AI into their workflows:
| Domain | AI tool | Capability at risk |
|---|---|---|
| Writing | AI drafting assistants | Ability to structure arguments, articulate ideas, find the right words under pressure |
| Data analysis | AI analytics tools | Pattern recognition intuition, ability to spot anomalies, statistical reasoning |
| Software development | AI coding assistants | Deep understanding of how code works, ability to debug without AI, architectural judgment |
| Decision-making | AI recommendation engines | Judgment developed through wrestling with hard decisions, risk intuition |
| Research | AI summarisation tools | Ability to evaluate sources, synthesise conflicting information, form original views |
| Customer service | AI support agents | Empathy, de-escalation, handling genuinely novel situations |
Why this is a business continuity problem, not just an HR one
Most organisations frame skill erosion as a talent development concern — something for HR and L&D to worry about. But it is also a fundamental business continuity risk.
Consider: if your AI-assisted document-processing system goes down, and the team that previously handled the work manually has forgotten how to do it, or has been replaced by a smaller team that never learned, you don't just have a technology outage. You have an operational capability gap that cannot be closed quickly.
Militaries understood this long before AI. They still train soldiers in map-reading despite universal GPS availability, and they still train pilots in manual flight despite sophisticated autopilot systems. The reasoning is simple: when the technology fails, the human has to be able to take over.
What smart organisations are doing about it
The answer isn't to slow AI adoption — the competitive cost of that is too high. The answer is to adopt AI in a way that augments rather than replaces human capability. Organisations getting this right are doing several things:
- Rotating new staff through manual workflows before introducing AI assistance, so foundational skills are built first
- Running quarterly "manual mode" drills — treating it like a fire drill, normalising the idea that AI won't always be available
- Requiring human explanation of AI outputs — if a team member cannot explain why an AI recommendation makes sense, they're not allowed to act on it
- Preserving documentation of manual processes rather than retiring them when AI takes over
- Hiring for domain expertise alongside AI proficiency, not instead of it
- Measuring human capability independently of AI-assisted performance metrics
The deeper question for leaders
Every leader adopting AI should be asking two questions, not one. Not just "how do we get the most from AI?" but also "what human capabilities are we at risk of losing, and how do we protect them?"
The organisations that answer both questions will have the best of both worlds: AI-accelerated productivity and the human capability to function, adapt, and make sound judgements when AI isn't available or isn't right.
The organisations that only ask the first question are building a dependency that, sooner or later, will cost them.
Protect your human capabilities while adopting AI
Our free guide includes a section on maintaining human fallback capability — the practical steps organisations can take to ensure AI augments rather than erodes their people's expertise.
Download Free Guide →