The Role Survived, but the Work Moved Elsewhere


Chapter 4 of Reshuffle shows that jobs don't vanish because tasks are automated. They vanish because the architecture of work changes. And that changes everything for leaders.

Tags: artificial intelligence · future of work · leadership · books · strategy

Continuing my reading of Reshuffle by Sangeet Paul Choudary (first post here, second here), chapter 4 brought an idea that bothered me in a productive way.

Most conversations about AI and the labor market follow the same script. Someone lists the tasks in a role, evaluates which ones AI can handle, and concludes whether the job is “at risk” or “safe.” It’s an intuitive framework. And, according to Choudary, it’s insufficient.

Jobs don’t disappear because tasks disappear

The sentence that opens the chapter is direct: “Jobs don’t disappear because tasks vanish. They disappear because the architecture of work no longer needs them.”

It sounds subtle, but the difference is enormous. Looking at tasks one by one and asking “can AI do this?” ignores the fact that AI doesn’t just restructure individual tasks. It restructures entire systems. And when the system changes, a role can become irrelevant even if none of its tasks have been automated.

Choudary gives an example that stuck with me: a task can resist automation and still be performed by workers. But the job that was built around that task may cease to exist, simply because the organizational context changed.

For leaders, this inverts the logic. The question isn’t “which roles does AI replace on my team?” It’s “does the way I’ve organized these roles still make sense in the system that’s emerging?”

The missing concept: contextual value

What struck me most in this chapter was the idea of contextual value. Choudary argues that the economic value of an activity doesn’t depend only on the skill required or the effort involved. It depends on two things: scarcity and relevance within a system.

Contextual value is the importance of a task in determining system outcomes, regardless of the skill it requires. A task can be difficult, well-executed, and require years of training. But if it’s not central to how the system works, its economic value is low.

This made me think about education. A teacher who masters content and pedagogy has valuable skills. But if the education system reconfigures to the point where content delivery is no longer the bottleneck (because students have access to infinitely patient AI tutors), the contextual value of that activity changes. Not because the teacher got worse. Because the system changed around them.

The double disruption

AI causes two things simultaneously, and that’s why it’s so hard to navigate:

1. Automation eliminates scarcity. If AI can perform a task, the economic value of that task drops even if people continue doing it, because it's no longer scarce. Think about translation: translators still exist, but the value of a standard translation has plummeted because AI produces an acceptable version for free.

2. Coordination redistributes relevance. AI reorganizes workflows and shifts who holds leverage. Tasks that were central can become peripheral, and tasks nobody valued can become the new choke points where value concentrates.

For leaders, this challenges an assumption we carry without questioning: that if my team becomes more productive, results improve proportionally. Choudary shows that productivity doesn’t guarantee returns. If productivity rises but the scarcity of what you do falls, the market value of your work can decrease even as you deliver more.
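A toy numerical sketch (my own illustration, not a model from the book) makes the productivity trap concrete: if the value of a unit of work is scarcity times relevance, output can triple while total market return falls.

```python
# Toy illustration (my own sketch, not Choudary's model): treat the value
# of one unit of work as scarcity * relevance, and total return as
# output * per-unit value. All numbers are made up for illustration.

def market_return(output_units: float, scarcity: float, relevance: float) -> float:
    """Total market value: units delivered times per-unit value."""
    return output_units * scarcity * relevance

# Before AI: 10 units of work, fully scarce, central to the system.
before = market_return(output_units=10, scarcity=1.0, relevance=1.0)

# After AI: productivity triples, but scarcity collapses because AI
# produces an acceptable version at near-zero cost.
after = market_return(output_units=30, scarcity=0.2, relevance=1.0)

print(before, after)  # the team delivers 3x more, yet total return is lower
```

The point of the sketch is the interaction, not the numbers: no amount of output growth compensates if scarcity falls faster, and relevance can drop independently when the system reorganizes around you.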

Above or below the algorithm

Of all the ideas in the chapter, this distinction is the one that stayed with me most.

Operating above the algorithm means shaping the system. Defining the criteria, the parameters, what’s relevant. Deciding how the parts connect.

Operating below the algorithm means being shaped by it. Following rules someone else defined. Reacting to criteria someone else chose. Being optimized (or discarded) by a system you don’t control.

In education, this feels particularly relevant. The pedagogical coordinator who defines how AI is used in the learning experience is operating above the algorithm. The one who receives a ready-made tool and merely implements it is operating below. Both exist, both work. But their position in the system is completely different.

And this applies to any sector. The question for leaders is: am I designing how AI integrates into my system, or am I just adopting what the provider offered me?

Reskilling for what?

The chapter ends with a provocation about reskilling that feels urgent. Everyone talks about reskilling teams. Training people to use AI. Teaching “prompt engineering.” All of that has value.

But reskilling won’t help if the role you’re preparing someone for no longer makes sense within the new logic of work. It’s like training someone to be the world’s best telex operator in 1995.

For leaders, the question isn’t just “does my team know how to use AI?” It’s “will the roles I’m training my team for still exist when the system finishes reorganizing?”

That’s an uncomfortable question. Because the honest answer, often, is “I don’t know.” And dealing with that uncertainty, without freezing and without pretending you know, might be the most important leadership skill of this moment.

What I take from this

Chapter 4 left me with an uncomfortable clarity. It’s not enough to ask the same old questions with a new tool. The questions have changed. What defines value has changed. The way work organizes itself is changing.

For leaders, the invitation is to look beyond tasks and see the system. To understand where contextual value is migrating. To identify whether your team’s activities are gaining or losing relevance in the new configuration. And, most importantly, to decide whether you’re shaping that configuration or merely reacting to it.


If this topic interests you, I’d love to exchange ideas. Find me on LinkedIn.
