LinkedIn & Anthropic killed specialist roles. Are learning design roles next?
AI is collapsing the L&D workflow — and creating a new kind of "full stack LXD" who owns the entire lifecycle
Hey folks 👋
In December 2025, LinkedIn’s Chief Product Officer Tomer Cohen scrapped the company’s Associate Product Manager programme. In its place, he launched something called the Associate Product Builder programme — a track that teaches coding, design, and product management together instead of treating them as separate specialisms.

Cohen’s reasoning is blunt: splitting responsibilities across specialists with constant handoffs had slowed product development to a crawl. LinkedIn didn’t need faster specialists: it needed people who could take a product from idea to launch themselves.
This shift is contested in product circles — there are legitimate concerns about depth, about design quality dropping when everyone “does everything,” and about whether the model works outside frontier tech companies.
But whatever the objections, the direction is clear, and it’s not confined to product management. At Anthropic, Boris Cherny — creator of Claude Code — predicted that “software engineer” as a job title would start fading in 2026. Leaders at Anthropic expect AI to write around 90% of code in the near term, and the majority of new code is already AI-generated. The company now hires mostly generalists, not specialists — because, as Cherny puts it, “the model can fill in the details.”

The pattern is clear: AI is collapsing specialist work, expanding what one person can do, and shifting the premium from niche expertise to end-to-end capability.
The same pattern hasn’t yet fully emerged in learning design — but there are signals to suggest that it is on its way.
In this week’s post, I explore where learning design is likely headed, why, and what this means for your skills, your tools, and your job description.
Why the Traditional Model Is Breaking
If you work in L&D, you probably recognise this workflow: your team is organised around the phases of production. There are analysis people, design people, development people (often further split by tool — the video person, the eLearning person, the LMS person), delivery people, and evaluation people.
Each specialist goes deep in one slice of the workflow and one set of tools. Handoffs are constant: a single course might pass through five or six pairs of hands before it reaches a learner. Updating content often means effectively rebuilding it.
This model made sense when production was genuinely hard and required years of craft mastery. Thanks to AI, three pressures are converging to break it:
First, the pace of knowledge change now exceeds the pace of course production. Policies, products, and regulations shift faster than specialist teams can respond. By the time a course clears its handoff chain, the source material has moved on.
Second, organisations are asking L&D to do more — more programmes, more audiences, more modalities, more languages — without proportional headcount. The maths doesn’t work if every course requires five specialists.
Third, AI is rapidly commoditising the production work that justified specialist roles in the first place. If AI can draft scripts, generate video, build assessments, and package SCORM in minutes, what’s the value of a team organised around those handoffs?
This closely mirrors the diagnosis Cohen made at LinkedIn. His problem wasn’t that individual specialists were too slow — it was that organisational complexity (more roles, more handoffs, more approvals) had become the bottleneck to growth.
The solution wasn’t to make each specialist faster — it was to collapse the specialisms. So, what might this mean for L&D?
The Rise of the Full-Stack LXD: Three Key Signals
The shift towards the “full stack builder” in LXD is already underway — in pockets, not everywhere, but with enough momentum to take seriously. Here are three signals I’m keeping an eye on in 2026:
Signal #1: Demand is Changing
The skills organisations need are changing faster, more broadly, and more continuously than the traditional L&D model was designed to handle.
It's no longer about building a compliance course once a year or rolling out a leadership programme once a quarter. Organisations now need to develop AI literacy, technical fluency, and human skills simultaneously — across entire workforces, updated constantly, often personalised by role or region.
Earlier this month, PwC launched its Learning Collective — a fundamental rethink of how the firm develops talent across 364,000 people and an early example of what this demand looks like in practice.

The programme moves learning beyond traditional courses into client work, everyday problem-solving, and AI-powered coaching — built around 30 skills that deliberately fuse AI capability with human judgment. As PwC's Chief People Officer put it: "Learning can no longer wait for the right time, place, role or ladder."
This is the kind of learning that the specialist model wasn’t built to deliver. Continuous, embedded, AI-native development programmes can’t be designed through a chain of handoffs between an analyst, a scriptwriter, a Storyline developer, and an LMS admin. They need designers who can see and steer the whole system — and they need them now, not after a six-month production cycle.
PwC won’t be the last. As AI reshapes every function, the demand for this kind of rapid, whole-workforce capability building will only accelerate — and the gap between what’s needed and what a specialist handoff model can deliver will widen.
Signal #2: Full-Stack LXD Pilots are Becoming Common
I am currently working with a number of large corporates that are actively experimenting with pilots of the full-stack LXD model — and the pattern is showing up elsewhere too.
Drawing on 50+ case studies and data from 800 organisations worldwide, Bersin’s new Definitive Guide to Corporate Learning (2026) describes the shift from what he calls the “publishing model” — where L&D takes a request and delivers a course three to six months later — to “dynamic enablement,” where small, AI-first teams diagnose problems, design interventions, and iterate in real time.
At Rolls-Royce, the learning and leadership team is piloting AI tools that allow HR specialists to work across traditional functional boundaries — what Bersin explicitly calls creating “full-stack HR people.”
Mary Glowacka, the company’s Head of Learning and Leadership Development, describes how they’re testing the use of AI as a daily “thinking partner” that lets individuals research, design, and share learning interventions that would previously have required multiple handoffs across the team.
At a recent Intellum industry panel, experts from Google and EPAM described pilot programmes where instructional designers are evolving into entirely new roles focused on “AI content architecture and enablement”.
What’s striking to me is that these pilots are all driven by the same three pressures:
Org knowledge is changing faster than courses can be built
Demand for more output without more headcount
The commoditisation of production work that used to justify specialist handoffs, thanks to tools like Colossyan Learn and Synthesia 3.0
AI has removed the old bottleneck — production capacity — and exposed a new one: the specialist handoff model itself. Some organisations are responding by trying to make each specialist faster. Others are piloting the collapse of specialisms entirely.
My bet: the organisations that collapse specialisms first will compound their competitive advantage. A full-stack designer iterating a living programme weekly will outpace a specialist team shipping a polished course quarterly — not because the course is worse, but simply because the world moved on while it was being built.
Signal #3: LXD Tools are Evolving
Another trend I’ve spotted is that the tooling landscape for LXD is quietly reorganising along the same lines. You can see this happening across three layers:
First, specialist tools are adding AI without changing their architecture. Articulate 360 — still the world’s most widely used authoring platform — is adding AI-generated outlines, narration, images, quizzes, and localisation. But the workflow is unchanged: author in one tool, export SCORM, upload to a separate LMS. AI speeds up each step without collapsing them. This is the best possible version of the current model.
Second, a new category of AI-first platform is being built for the full-stack LXD. Colossyan Learn, launching this month, is a useful case study because its own strategic evolution mirrors the broader shift.
Colossyan started as an AI video generator — a tool that made one specialist function faster. With Colossyan Learn, the company has rebuilt around a fundamentally different proposition: a course creation system where video is one modality alongside text, visuals, and assessments, all in a single workflow. A designer goes from source document to published, trackable micro-course without switching tools. The roadmap — automated maintenance, version tracking, governance, localisation — explicitly targets the full lifecycle, not just the production step.

In a similar move, Synthesia 3.0 will pull what used to be separate steps — video creation, interactivity, basic assessment and SCORM packaging — into a single environment. Designers can go from script to interactive, trackable video object for the LMS without jumping between tools, quietly removing some of the technical barriers that used to keep “video production” and “course design” as separate specialist workflows.
On the LMS side, the movement is just as striking. Sana (acquired in 2025 by Workday for $1.1 billion) combines LMS, LXP, authoring, and AI tutoring in one AI-native environment. Old incumbents like Docebo and 360Learning are moving in similar directions by building content creation, video, coaching, and an agentic co-pilot directly into their platforms.
Third, at the bleeding edges learning is becoming a living system rather than a static course. Bersin's major research this month calls the shift "dynamic enablement" and argues that AI-first approaches are already dramatically outperforming traditional models. Meanwhile, platforms like Cornerstone are embedding AI agents directly into daily workflows, delivering learning in the tools people already use rather than pulling them into an LMS.
New "capability academy" platforms are being built around skills graphs and career paths, where courses are one ingredient rather than the whole product. And there's an unofficial signal that matters just as much: employees are already going to ChatGPT and similar tools for support in the flow of work, bypassing courses entirely. The infrastructure for Stage 3 is being assembled — unevenly, and no single platform delivers it all — but faster than the "roadmaps are pointing there" framing suggests.
The common pattern: the fragmented tool chain that the specialist model depends on is being collapsed into single environments. These tools don’t require the role change — but they remove the technical barriers to it.
What the Full-Stack LXD Role Looks Like
So what actually changes if this prediction turns out to be correct?
Look at how the shift has played out where it’s furthest along. At LinkedIn, Cohen didn’t just merge existing roles — he redefined what the job is. His Associate Product Builders learn coding, design, and product management together. But the five skills he emphasises as most important aren’t any of those craft skills. They’re vision, empathy, communication, creativity, and judgment. His view on everything else? “I’m working really hard to automate it.”
At Anthropic, Cherny describes the same inversion. His engineers spend less time writing code and more time on architecture, requirement specification, and reviewing AI output. The most productive engineer is no longer the fastest coder — it’s the person who can run five AI agents simultaneously, direct them toward the right problem, and catch when their output is subtly wrong.
The pattern in both cases: the premium shifts from production craft to judgment and systems thinking. The ability to do the work matters less than the ability to direct it, quality-control it, and connect it to the right outcome.
The same shift is starting to appear in L&D hiring — quietly and unevenly, but unmistakably.
If your professional value proposition is "I know how to work Moodle" or "I'm fast in Storyline," that's increasingly a claim AI can match. What's emerging instead are six skills that AI can't replicate — and that are remarkably consistent across every field where the full-stack shift is playing out:
1. Vision
Forming a point of view on what capabilities the organisation actually needs, not just responding to requests. At LinkedIn, builders are expected to identify what to build and why, not wait for a spec.
For an LXD, this means shifting from “what course has been requested?” to “what capability gap is actually harming the business?”
2. Empathy
Understanding learners in ways that dashboards can’t capture. Cohen insists his builders talk to users constantly, not just read analytics.
For an LXD, this means interviews, observation, and context — the things that reveal why a programme isn’t working, not just that it isn’t.
3. Communication
Aligning stakeholders around a shared narrative. LinkedIn’s full-stack builders don’t hand off to a separate PM to manage stakeholders — they own the relationship themselves.
For an LXD, this means having outcome conversations with business owners, not just delivery conversations with project managers.
4. Creativity
Designing non-obvious solutions. Cohen’s builders aren’t just shipping features faster — they’re rethinking what to build in the first place.
For an LXD, this means interventions that go beyond courses: workflow redesigns, nudge systems, peer learning structures, performance support in the tools people already use.
5. Judgment
This is a big one. At Anthropic, Cherny’s engineers review AI-generated code for subtle errors the model can’t catch. At LinkedIn, Cohen treats judgment as the master skill that sits above all others.
For an LXD, judgment is the ability to look at a dozen AI-generated learning assets and know which one will actually produce behaviour change — and which one just looks like it will. It’s catching what’s plausibly good but pedagogically wrong. That requires deep learning science expertise, not just breadth across the lifecycle.
And, of course, underpinning all five of these competencies will be a new foundational skill…
6. AI orchestration — not "can you use ChatGPT," but designing, building and running multiple AI agents simultaneously.
For an LXD, that might mean one agent summarising source materials, another drafting a script, another generating assessment items, and another analysing learner data from the last iteration — all while you steer the whole system toward the right outcome.
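For readers who like to see the pattern concretely, here is a minimal Python sketch of that fan-out-and-review loop. Everything in it is a hypothetical placeholder — the agent functions don’t call a real model or product API; in practice each would prompt whichever LLM or tool you use. The point is the shape: sub-tasks run in parallel, and the human designer stays in the loop as the final reviewer.

```python
# Illustrative sketch only: each "agent" below is a stub standing in for a
# real LLM call. The orchestration pattern is the point, not the functions.
from concurrent.futures import ThreadPoolExecutor

def summarise_sources(doc: str) -> str:
    # Placeholder: would prompt a model to summarise the source material.
    return f"summary of: {doc[:30]}"

def draft_script(doc: str) -> str:
    # Placeholder: would prompt a model to draft a course or video script.
    return f"script based on: {doc[:30]}"

def generate_assessment(doc: str) -> list[str]:
    # Placeholder: would prompt a model to write assessment items.
    return [f"question about: {doc[:30]}"]

def run_pipeline(source_doc: str) -> dict:
    """Fan the sub-tasks out to 'agents' in parallel, then hand the
    assembled draft back to the human designer for review."""
    with ThreadPoolExecutor() as pool:
        summary = pool.submit(summarise_sources, source_doc)
        script = pool.submit(draft_script, source_doc)
        quiz = pool.submit(generate_assessment, source_doc)
        draft = {
            "summary": summary.result(),
            "script": script.result(),
            "assessment": quiz.result(),
            # Judgment stays with the designer, not the agents:
            "status": "awaiting human review",
        }
    return draft

course_draft = run_pipeline("New product policy, v3 ...")
print(course_draft["status"])
```

The design choice worth noticing: the agents never publish anything. Every path through the pipeline ends at a human review step — which is exactly where the “judgment” skill above earns its keep.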
In practice, this means a shift from specialised task execution — where each phase of the learning lifecycle is owned by a different person with a different tool — to end-to-end problem ownership, where one designer steers the whole system from needs analysis through to measured outcomes.
What used to require a team of five or six specialists with handoffs at every stage — analyst to designer to developer to video producer to LMS admin to evaluator — becomes one person working alongside AI automations and agents.
The human defines the problem, makes the design decisions, quality-controls the output, owns the stakeholder relationships, and interprets the results. AI handles the production, surfaces the intelligence, and runs the routine operations.
The table below shows what this new “full-stack”, human + AI workflow might look like in practice, phase by phase:

The pattern is consistent across every phase: automations handle the making, agents surface intelligence, humans make the decisions.
How to Prepare
The full-stack learning designer isn’t here yet — but the forces that have already transformed product management and software engineering are now starting to reshape L&D. LinkedIn has formalised this with a title, a career ladder, and a training programme. L&D hasn’t got there yet. That’s both the risk and the opportunity.
If you lead an L&D function:
Don’t start with “everyone becomes a full-stack builder tomorrow.” Cohen didn’t. He started with platform (building the internal AI tools), then agents (embedding them into workflows), then a small group of pioneers (the first cohort of Associate Product Builders), then culture (updating performance reviews to include AI proficiency), then scale. The key lesson: this is a change management problem as much as a technology one.
Practically, that means three things:
First, run a full-stack pilot. Pick one programme — ideally something with a short feedback loop, like onboarding or a product knowledge update — and give one designer or a small pod full ownership end-to-end, with AI tools and a mandate to move fast. Measure what happens to speed, quality, cost, and learner outcomes compared to the specialist model. You’ll learn more from one real pilot than from six months of strategy decks.
Second, invest in tooling that collapses the workflow, not just accelerates it. There’s a meaningful difference between adding an AI assistant inside your existing Storyline workflow and giving a designer a platform where they go from source document to published, trackable course without switching tools. The first makes specialists faster. The second makes the full-stack model possible. Get your team hands-on with the AI-first platforms in that second layer — Colossyan Learn, Synthesia 3.0, Docebo’s AI Creator, Sana Learn — and evaluate them not on feature lists but on how much of the lifecycle one person can own.
Third, start defining the career path now. LinkedIn formalised the full-stack builder with a title, a career ladder, and a training programme. Most L&D functions haven’t. If you don’t articulate what the full-stack LXD role looks like in your organisation — what it’s called, how it’s evaluated, where it leads — your best people will find somewhere that has. This doesn’t require a complete restructure. It starts with a job description, a set of competencies, and a signal to your team that breadth and judgment are valued alongside specialist depth.
For highly regulated environments, this doesn’t mean abandoning specialist roles entirely. It means shifting the locus of accountability to the person who can see the whole system, with specialists available for depth where compliance and safety demand it.
If you’re a learning designer:
Start building breadth now, before your job description changes. If you only build, start doing analysis. If you only design, start reading your data. If you only take briefs, start having outcome conversations with stakeholders.
Get fluent with AI tools — not as a Friday afternoon experiment, but as core workflow. Practice rapid prototyping: problem → solution idea → testable output (a prompt template, GPT, app or agent) in a single day.
Develop your editorial judgment. This is the skill that’s hardest to see and hardest to replace. When AI can generate a dozen options in minutes, knowing which one is pedagogically sound, culturally appropriate, and strategically right is what separates a full-stack LXD from someone who’s just fast with AI tools.
LinkedIn’s global labour-market research suggests that by 2030, 70% of the skills used in most jobs will have changed. Traditional L&D models built for a slower world aren’t just inefficient — they’re becoming a strategic liability.
Closing Thoughts
Despite all of the change that’s heading our way, there’s also a lot of continuity at play. A key takeaway of all of this for me is that our work itself isn’t broken or changing — it’s the process that is.
Whatever the impact of AI, learning design remains a five-step process:
research a problem
design a solution
build it
launch it
iterate it.
Over time, companies turned each of these steps into dozens of sub-steps requiring multiple teams, reviews, and functions. What used to require one builder now takes six months and 10 to 15 different teams just to ship a small feature.
The organisations that create the full-stack LXD path first will attract the best talent and compound their advantage. The designers who start walking it before it’s formalised will be the ones leading pods, owning outcomes, and shaping what L&D looks like in the AI era.
The ones who wait for permission will find the role has already moved.
Happy building,
Phil 👋
PS: Want to start your journey towards becoming a full-stack LXD? Apply for a place on my AI & Learning Design Bootcamp