Assisted Vs Autonomous AI for Educators
How different AI models could impact the what, why & how of our roles.
This week, I’ve been in Madrid speaking with Learning leaders at the UN about AI & Learning: where we are and where we’re (potentially) going.
Together we explored the impact of two AI models on Learning & Development: assisted AI, which enhances human capabilities in learning design and delivery, and autonomous AI, which automates the learning design and delivery process.
We explored what assisted and autonomous AI looks like in practice by applying each to the ADDIE process. Our conclusion was that the future of AI in L&D and education more broadly hinges far more on our ability as humans to embrace and adapt to change than it does on technological advances.
Here’s a summary of our conversation and conclusions.
Assisted & Autonomous AI in L&D
When it comes to learning analysis, design, delivery and evaluation there are two lenses through which we should think about AI:
AI as Assisted Intelligence: in this model, we use AI to augment human decision-making & actions to enable us to design, deliver and evaluate more personalised and adaptive learning experiences faster. Think of this as using AI to oil the existing learning design, delivery, implementation and evaluation machine - like adding fuel injection to a pre-existing car.
AI as Autonomous Intelligence: In this model, we surrender more control and decision making power to AI. Think of this as AI taking over all aspects of learning design, delivery and evaluation with a “human in the loop” to validate and QA what is created. Rather than oiling the machine, this requires us to rethink the purpose and operations of the machine from the ground up.
Assisted Intelligence + ADDIE
Next, we explored the question: what does assisted & autonomous AI look like in practice for educators? We started by focusing on assisted intelligence.
The goal and purpose of assisted intelligence is to enhance and supercharge human capabilities, not replace them.
In this scenario, AI is a productivity tool in the hands of a human, with the purpose of making them faster and - perhaps - enabling them to focus on tasks where they can add the most value, specifically things like:
deeper needs analyses;
research / getting to know the topic;
building relationships with subject matter experts;
making data-informed design decisions.
Assisted Intelligence is the most common AI use case on the ground right now; ~70% of the learning professionals I have interviewed report that they regularly use AI to assist their day-to-day work, especially when it comes to content creation.
Tools like text-to-video, text-to-audio, text-to-image, text-to-quiz and translation tools are particularly popular.
Here’s a summary of the most common use cases I have come across:
One of the key principles of Assisted Intelligence is its reliance on human expertise. In this model, at least for now, the quality of AI’s assistance depends significantly on the quality of a) the inputs selected by humans and b) the prompts provided by experts.
Example: if I ask ChatGPT to design me a course, its instructional design skills are poor. However, if I prompt it using instructions based on advanced instructional design skills (i.e. if I “teach” AI how to make design decisions), it does a great job.
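To make this concrete, here's a minimal sketch of what "teaching" an LLM instructional design decisions via the prompt might look like. The function name and the design principles embedded in the prompt are my own illustrative assumptions, not a recipe from any specific tool:

```python
# Illustrative sketch: encoding expert instructional design knowledge
# into the prompt before handing the task to a model such as ChatGPT.
# The prompt wording and principles below are assumptions for
# illustration, not a definitive prompt recipe.

NAIVE_PROMPT = "Design me a course on {topic}."

EXPERT_PROMPT = """You are an expert instructional designer.
Design a course on {topic}. Follow these principles:
1. Start from measurable learning objectives (use Bloom's taxonomy verbs).
2. Sequence content from simple to complex, with spaced practice.
3. Include an active-learning activity and a formative assessment
   for every objective.
4. End each module with feedback and reflection prompts."""

def build_prompt(topic: str, expert: bool = True) -> str:
    """Return the prompt we would send to the model."""
    template = EXPERT_PROMPT if expert else NAIVE_PROMPT
    return template.format(topic=topic)

if __name__ == "__main__":
    print(build_prompt("data literacy for managers"))
```

The naive prompt leaves every design decision to the model; the expert prompt transfers the human's expertise into the input, which is why the quality of AI assistance still depends so heavily on the human behind it.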
Autonomous Intelligence + ADDIE
Autonomous intelligence is less about oiling the human-operated machine and more about handing over control of how the machine operates and develops.
Autonomous AI enables us to imagine a - perhaps utopian, perhaps dystopian - world where AI systems make decisions and take actions without explicit human intervention at each step.
In the L&D world, the power dynamic shifts from AI supporting the human to move through the design, delivery and evaluation process, to AI leading, iterating and improving the process on the fly based on real-time data.
Here’s what it might look like in practice:
In the autonomous AI context, the role of the human is to feed and QA the machine, but the machine takes ultimate control of using data to design, deliver, iterate and evaluate the impact of learning.
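The loop described above can be sketched in a few lines. Everything here is a hypothetical illustration of the shift in roles (the human validates, the AI designs and iterates), with stand-in function names rather than any real product's API:

```python
# Highly simplified sketch of an autonomous, human-in-the-loop learning
# cycle. All function names and data shapes are hypothetical.

def design_course(learner_data):
    """Stand-in for an AI system drafting a course from data."""
    return {"modules": ["intro", "practice", "assessment"],
            "source_data": learner_data}

def human_qa(course):
    """The human's role: validate what the AI produced, not redesign it."""
    return len(course["modules"]) > 0

def collect_results(course):
    """Stand-in for real-time learner performance data."""
    return {"completion_rate": 0.8}

def autonomous_loop(learner_data, iterations=3):
    course = design_course(learner_data)
    for _ in range(iterations):
        if not human_qa(course):   # human gate: approve or halt
            break
        results = collect_results(course)
        # The AI iterates on its own design using live data
        course["source_data"] = results
    return course
```

The key design point is where the human sits: inside the loop as a quality gate, rather than at the keyboard making each design decision.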
AI + Education: Utopia or Dystopia?
When we talk about AI & L&D, we need to ask ourselves: what sort of AI are we talking about? And what is the impact on both learning and the people who design, deliver and evaluate that learning?
Both assisted and autonomous AI offer huge opportunities for us to improve the speed, quality and impact of learning experience design, delivery and evaluation. But they also come with risks. Here are just a few:
Risks for Human Learning Professionals
In the post-AI world, the role of the human learning designer or educator changes dramatically:
Train and manage/prompt AI
Assess & QA output
Track impact and other data patterns
Keep abreast of new AI technologies
Embracing and leveraging the power of AI requires us to be open and willing to change significantly both what we do and how we do it. We need to be open to a change in our identity: from teachers of humans to teachers of AI.
If we’re willing to do this, we also need to be empowered to embrace AI: we need substantial training and support to develop knowledge and skills which are significantly different from those we have today.
Risks for Equity
TL;DR: the data that we feed AI influences its output. AI is built by humans who decide a) what to train it on and b) how it makes decisions.
In order to deliver AI in education responsibly, we need to ensure that the data we feed into AI is reliable and equally representative of all learners. The risks here are very real. A study published in Nature Medicine highlighted a significant under-diagnosis bias in AI algorithms when applied to chest radiographs: because certain subgroups, including non-white populations, were less represented in the data, they were significantly more likely to be under-diagnosed.
Risks Associated with Learner <> Machine Interaction
Here, I’d like to raise a question which will be familiar to you all as learning professionals: If we build it, will they come [and learn]?
We make a lot of assumptions here. We assume that if we build tools like the 1:1 AI tutor:
learners will respond positively, and want to engage with them;
their impact will be the same as a human equivalent;
they will have the same impact on all learners of all kinds.
Many companies and investors are placing a very big bet on this, but the truth is that we just don’t know yet.
There’s still a lot of research to do before we can validate these hypotheses. One thing we know from other contexts is that human-machine interaction is complex and doesn’t follow the same rules as human-to-human interaction.
For example, medical research has shown that even when patients are told that an AI can give them a more reliable diagnosis of their illness than a human, they would still rather have a human diagnose them.
So, the big question here is: even if in theory autonomous AI can deliver better outcomes, will learners want them? Will learners be happy and comfortable with AI-powered teaching and learning? Or, as in the medical context, will they demand a new sort of hybrid approach that combines the computational power of AI with the things which we perceive to be uniquely human and critically important - things like integrity, trustworthiness and compassion?
Only more research and time will tell.
We’re already inhabiting a space where AI is being used to assist us to design, deliver and evaluate learning experiences.
Given the extent of the impact of AI assistance on the speed, quality and costs of learning analysis, design, delivery and evaluation, I don’t doubt for a second that we will continue to see AI experts and educators experimenting and building tools designed to assist instructional designers and others across the end-to-end process.
This “oiling of the machine” is a comfortable and palatable sort of AI-innovation for humans who, while assisted, retain fundamentally the same role, responsibilities and authority in their day to day work.
Whether we move beyond the power of AI to oil the machine and leverage the power of autonomous AI to design and deliver learning experiences is TBC.
One thing we know for sure is that AI could automate the learning analysis, design, delivery and evaluation process. Whether or not we ask it to - and what impact it has on learners - depends less on technology and more on our willingness and appetite as humans to change how we live, work and learn.
PS: Want to get hands on with me and learn how to leverage AI in your learning analysis, design, delivery & evaluation? Apply for a place on my AI Learning Design Bootcamp.