AI

I hear a lot of fearful AI takes from people I know, all centred on the same theme: being wholesale replaced by AI. I think the root of this anxiety comes from two places. First, not understanding how the latest AI works. Second, viewing yourself as an indivisible bundle: you plus your current job. The first fear can’t be assuaged in a short essay, but the second definitely can.

AI tackles big problems through unbundling. For coding, it divides the work into specialized subtasks (architecture, implementation, testing, iteration) and spins up an agent dedicated to each. Since Everything Is A Codebase now, it will look at you and your job the same way, and it will be correct. Your job is actually a bundle of tasks you complete with a bundle of skills. AI is already breaking it all down and reconstituting new wholes. And it didn’t invent this paradigm. It’s learning from history.

Until the 1970s, large firms had dedicated pools of typists. Executives would handwrite memos and send them to the pool to be prioritized and transcribed onto carbon paper. When personal computers arrived, typists got unbundled. Typing as a skill was transferred to the executives, eliminating the lag between idea and execution. The former typists had to combine their now-trivial typing skill with domain knowledge to make new bundles. The successful ones got recomposed into administrative assistants, paralegals, and technical writers. The rest had to look elsewhere.

Once you’ve unbundled your skills, you need a way to evaluate their AI-resiliency. Here’s a useful heuristic: are you engaging data or humans? Engaging data implies days spent living in spreadsheets and databases. It implies analyzing, optimizing and forecasting. This is not AI-resilient. When these skills get unbundled, they don’t end up back in the hands of humans. They’re offloaded to agents, so humans can focus on engaging humans. Engaging humans implies living in Teams calls, boardrooms, and luncheons. It implies negotiating, mediating, and selling.

Success will mean rebundling yourself around engaging humans. That may mean learning the dark arts of navigating the biases, insecurities, and principles of yourself and others. It will definitely mean owning the surface area of legal, regulatory and reputational risk of an exponentially growing terrarium of agents.

It may seem bleak that humans are becoming less efficient than AI. That the number of skills we can competitively offer is shrinking. That fear mistakenly assumes there’s a fixed amount of demand for every skill.

Geoffrey Hinton, the ‘godfather of AI’, famously said in 2016 that people should stop training radiologists. Now, in 2026, there are more radiologists than ever. It turns out demand for radiology outstripped the previously constrained supply. Radiologists simply do more work than before as a result of AI. AI can analyze x-rays, CT scans, and MRIs faster than humans, relieving a bottleneck. But once the marginal cost of a scan dropped, we wanted to run far more of them. And the human work left in each scan, like patient interaction and reviewing edge cases, is smaller than before. How many more reps of your AI-resilient skills could you do if you were unburdened?

But what if your AI-resilient skills are common? Won’t you get beaten by top performers as competition intensifies and the available arenas dwindle? Not if you rebundle skilfully. Here’s where the math works in your favour:

Scott Adams, author of the Dilbert comic strip, claimed he owed his success to ‘skill stacking.’ He rated himself as merely in the top 25% of the world’s cartoonists, comedians, and businessmen. On their own, those ratings are unremarkable. But put the three together and he was in the top 1% of people at the intersection. By stacking two or more AI-resilient skills, you can quickly become top 1% at your own unique intersection of domain knowledge.
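
If you want to see that math worked out, here’s a back-of-the-envelope sketch. The 25% figures come from Adams’ own claim; the assumption that the three rankings are independent is mine, and it’s generous, but it shows how quickly ordinary percentiles compound:

```python
# Back-of-the-envelope skill-stacking math.
# Assumes the three skill rankings are independent, so the share of people
# in the top 25% of all three is simply the product of the shares.
skills = {"cartooning": 0.25, "comedy": 0.25, "business": 0.25}

intersection = 1.0
for share in skills.values():
    intersection *= share  # fraction of people still in the running

print(f"Top {intersection:.1%} at the intersection")  # -> Top 1.6% at the intersection
```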

So now you have:

  1. A framework to understand AI’s impact on your job (unbundling)
  2. A heuristic to find your AI-resilient skills (engage humans)
  3. A formula to combine those skills into a moat (skill stacking)

All of these hinge on you running towards the problem, not away from it. I must insist: don’t underestimate AI. Don’t assume the past performance of AI predicts the future. It’s getting better every day. There will be places you can run to. There will be jobs that feel safe because AI hasn’t touched them yet. But those will be akin to an overhang of snow: if there’s no structural reason humans add value there, they will collapse in an avalanche within your lifetime.

Start by using AI to find efficiencies in your own personal life. Unbundle your tasks and skills and find the use case that’ll save you 5 minutes per week. Then brag about your wins online. That’s the first rep in learning to think the way your employer wants its AI zealots to think. The work disruption won’t come from your AI team. It’ll come from people with the domain knowledge and courage to disrupt. Just don’t be quiet about it. You have a once-in-a-lifetime chance to write the playbook for how AI will disrupt your job, but don’t do it haphazardly. Unbundle your job and your skillset to match yourself to a structural moat, then go all in on accelerating it.