In the AI-disrupted world of work, is knowledge still power?

For more than sixty years, careers followed a simple logic: master something hard, become known for it, and trade on that expertise. When Peter Drucker coined the term "knowledge worker" in 1959, knowledge was a valuable social currency. It was hard to obtain, the path to credentials exclusive, expertise rarely substitutable. Social psychologists French and Raven identified "expert power" as one of the fundamental bases of social power. Knowledge made you useful, and indeed powerful if you had more than others.

Now that logic is unravelling. AI can generate, summarise, and apply knowledge at scale. When anyone can create a credible strategy in minutes, expert power loses its edge one prompt at a time.

The response, so far, has been to learn faster. Generative AI is now Coursera's fastest-growing skill category, with learners enrolling at a rate of 14 per minute in 2025. Most of these courses teach tools, prompting, and the basics of machine learning. The logic is familiar: if the ground shifts, stack more credentials.

This is yesterday's answer to today's question.

The consensus correction is almost as stale. As machines advance, the story goes, human qualities become more valuable: judgement, emotional intelligence, sensemaking. True, but incomplete. It treats the problem as a skills gap. An old solution to a new problem.

Authority in organisations was never simply held. It was granted. Subordinates and peers decided, often unconsciously, whose analysis to trust, whose recommendation to follow, and whose framing of a problem to adopt. Expert power worked in part because the granting side of the equation had limited means to substitute the expert's contribution. That constraint is gone. The junior analyst can now pressure-test a partner's reasoning in real time. The board member can interrogate a CEO's strategy using the same models the CEO used to write it. The information asymmetry that quietly underwrote professional authority has collapsed.

Not only that, the job itself has changed shape. Leading people with ambitions, fears, and loyalties is not the same as orchestrating a fleet of AI agents, and leaders are now doing both at once. Meanwhile, the older game has not gone away: colleagues still compete for budget, headcount, and the ear of the senior leadership team, only now with machine-generated arguments on both sides of the table.

This is the harder problem. How do you hold authority in a room where everyone has the same answers? Authority isn't about being the most informed; that person no longer exists. Instead, authority is what you do with information: the questions you ask, the judgements you make visible, the trade-offs you articulate, and the conviction you carry when the model is uncertain and a decision is needed.

AI fluency will become only a baseline. Leadership skills are necessary but not enough. The true differentiator will be the ability to earn authority in a landscape where it is no longer assumed. The successful will be those who recognise that credibility's foundation has shifted, and who rebuild it for the new era.