Can AI clone your best people, finally “scaling expertise” like software? It’s a seductive question, but not quite the right way to think about the issue.
The marketing promise is simple: extract what the veterans know, package it into an algorithm, and suddenly every junior performs at a senior level, if a junior is even needed at all.
The standard objection is that most expertise is tacit. A senior lawyer doesn’t just know case law; she knows which arguments are likely to land. However, when an AI ingests not just case law but thousands of transcripts, decisions, and outcomes, the expertise isn’t pulled from a single head; it’s reconstructed from the data exhaust of many.
This creates economic shifts that are hard to ignore. LLMs draft contracts that often need less revision than junior work. Radiology models catch anomalies by seeing more scans than any human ever could. Code tools lift competent programmers by automating recurring patterns. These are productivity gains large enough to show up in earnings.
But that’s only half the story. Systems trained on expert behavior still struggle with genuinely novel cases. They excel within their training distribution and fail spectacularly outside it, often without warning. AI handles the eighty percent that follows patterns, leaving human experts the twenty percent that requires real judgment.
That isn’t so much “scaling” expertise as unbundling it. It shifts the cost curve by decomposing work into automatable and irreducibly human parts, then reassigning each accordingly.
This creates winners and losers. Experts whose value was pattern recognition over familiar cases face automation. Those who handle exceptions, synthesize across domains, or decide with incomplete information become more valuable because failures escalate to them. Routine expert work trends toward commodity pricing; judgment in ambiguity earns a premium.
The key to success, therefore, isn’t replacing expertise with AI but redesigning work around the new cost curve. Yes, that means serious investment. Companies pouring money into AI while gutting training and mentorship build fragility: automation atop eroding capability. When edge cases arrive (and they will), no one is left to think from first principles.
Organizations that instead use AI to accelerate learning, treating it as an always-available sparring partner and error-checker, may develop expertise faster than ever.
Expertise is being unbundled. Some of it will live in algorithms (more than skeptics admit, less than consultants promise). Perfect extraction is a fantasy. The reality is messier and, done well, more valuable.
The ultimate question isn’t whether AI can scale your expertise, but whether it will erode that expertise or accelerate it. And that choice comes down to whether you’re building an organization that can still think. Three years from now, will your experts be solving harder problems than today, or just managing more automation?