How I Turned a Keynote into a Podcast Using AI
The practical & pedagogical value of Google's Notebook LM
Hey folks,
Yesterday, I was lucky enough to be invited to give the opening keynote to ~500 L&D professionals at the Swedish Learning Conference in sunny Stockholm.
Here's the TL;DR of my paper:
1. The Exponential Rise of Generic AI in L&D
➤ 86% of L&D professionals are now using generic AI tools like ChatGPT and Copilot in their day-to-day work. This time last year, the figure was 5%.
➤ 50% of those people use AI for learning design tasks like needs analysis, writing objectives, selecting instructional strategies and writing scripts.
➤ In L&D, AI has a positive impact on creative, writing and productivity tasks - increasing speed by up to 65% and quality by up to 40%.
2. The Existential Risk of Generic AI in L&D
➤ Generic AI tools make all L&D tasks faster, BUT generic AI struggles with certain tasks, potentially reducing quality by up to 20%.
➤ Among the tasks which generic AI struggles with most are the tasks that more and more L&D professionals are using generic AI for - learning design tasks.
➤ The biggest risk of AI in L&D is that generic AI makes us more efficient at producing ineffective outputs.
3. Prediction for 2025 - The Rise of Specialised AI Tools for L&D:
➤ As a result of the risks of generic AI, I predict the rise of specialised AI tools for L&D - tools trained on industry-specific data and designed specifically to augment L&D workflows.
➤ Specialised AI tools in other industries, e.g. coding and medicine, have already proven their ability to augment the work of professionals, increasing speed by 50-70% and quality by 70-80%.
➤ My early research & testing suggests that, compared with the use of generic AI tools, a highly specialised AI copilot for L&D (Epiphany) can increase the speed of the instructional design process by 75% while increasing the quality of outputs (ready-to-build-training-designs) by 80%.
AI-Powered Research-Sharing
On my way home, I wondered how I could share the key takeaways from the keynote with you in a way that was dynamic and interesting. Then I remembered Google’s latest AI release: Notebook LM.
Described as “your personalised AI research assistant”, Notebook LM (which is powered by Google's most capable model, Gemini 1.5 Pro) enables users to upload either their own or others’ content, then analyse, summarise, explore and interrogate it. As part of its suite of research tools, it also allows you to turn any content into a podcast, so you can listen to a “conversational” version of the information you have gathered.
I wondered: what would happen if I uploaded a PDF of the slide deck I used for my keynote to Notebook LM? No detailed script, no notes, not even speaker notes - just a PDF.
The answer? In ~2 mins, Notebook LM turned my deck into:
A summary
A briefing doc
A study guide
A set of FAQs
A “podcast”, hosted by two AI hosts
You can check out the original deck, the podcast and the FAQs in the appendix, below.
But first, how well did Notebook LM do? I think it did a pretty great job. In just a couple of minutes, and without a huge amount of context, it captured the key themes and overall “vibe” of my keynote really well, picking up on some of the nuance of my research and messaging.
At times, there are some exaggerations and also some simplifications. At one point, for example, the concept of the “Jagged Technological Frontier” is attributed to me (it absolutely was not my research). Lesson learned: an AI tool like this works only with what it is given, rather than with broader context - which has both risks and benefits.
As it only read a PDF of my slides, and as I didn’t have a script, there are also inevitably some missing details. Lesson learned: the more detail we give to AI, the better the quality of the results we get out of it.
Overall, though, Notebook LM did a pretty impressive job of rapidly turning a relatively inaccessible, decontextualised slide deck into a useful set of notes, FAQs and - perhaps most wildly - an audio podcast.
This experience got me thinking about the potentially interesting pedagogical implications of being able to use AI to re-formulate and re-purpose content. Here’s where I landed:
Computational and Learning Efficiency: Multimodal learning - i.e. using multiple modes or channels to present information - has been shown to significantly outperform unimodal learning in terms of computational efficiency, solving complex problems faster. This is because integrating multiple sensory inputs can lead to more efficient algorithms, even enabling multimodal algorithms to solve problems that are intractable for unimodal ones (Zhou, 2023).
Enhanced Retention: Multimodal learning, particularly when combining visual and verbal inputs, improves vocabulary uptake and retention. This benefit is often explained through Paivio's Dual Coding Theory, which posits that combining verbal and visual elements enhances memory encoding and retrieval (Boers et al., 2017).
Improved Accessibility, Engagement and Attention: Research suggests that multimodal inputs attract more attention from learners, which can lead to better outcomes in terms of both engagement and retention. This is observed in the use of multimodal tools in education, such as combining text, visuals, and tactile tools in classrooms to support varied learning styles (Massaro, 2012).
TL;DR: Using AI to reformat and repurpose content seems to be a pretty powerful tool, both for sharing information and, when used intentionally, for fuelling the learning process.
I’m interested to hear if you agree! Take a look at the original deck and the AI-generated content below and let me know what you think in the related LinkedIn post!
Happy innovating,
Phil 👋
PS: If you want to get hands-on and test these and other AI tools in a safe and supported environment, join me and other learning professionals like you at an upcoming cohort of my AI-Learning Design Bootcamp.
_____________________
Appendix: The Content
The Deck
The original 29-slide deck which I uploaded to Notebook LM:
AI-Generated Podcast
The 29-slide deck, turned into an 11-minute podcast by AI:
AI-Generated FAQs
The 29-slide deck, turned into a set of FAQs by AI:
1. How are AI tools currently being used in Learning & Development (L&D)?
Currently, L&D professionals are using AI tools for various tasks, with varying adoption rates:
Learning Design (50%): Generating design ideas, writing learning objectives, defining instructional strategies, and defining course content and learning activities.
Content Creation (40%): Scripting content and automating content creation using text-to-image, video, and voice AI tools.
Research (35%): Learning about design topics, finding relevant research papers, and defining instructional strategies.
Admin (28%): Automating tasks like writing emails, generating reports, and making edits.
2. What are the "Big Four" use cases for AI in L&D in 2024?
The "Big Four" refer to the dominant areas where AI is making significant inroads in L&D:
AI-Augmented Analysis: Collaborating with AI to analyse learning needs, define business problems, and outline learner journeys.
AI-Augmented Design: Working with AI to rapidly develop instructional strategies, recommend activities and content formats, and draft initial content outlines.
AI-Augmented Development: Using AI to generate fully scripted course content, optimise content design based on learning science principles, and prepare materials for production.
AI-powered Scrap Learning Reduction: Leveraging specialised AI tools to significantly decrease the amount of learning content that is created but never implemented, leading to better ROI on L&D efforts.
3. What is the "Jagged Tech Frontier" and how does it relate to L&D?
The "Jagged Tech Frontier" represents the uneven capabilities of current AI tools. Some tasks fall "inside" this frontier, where AI excels and significantly enhances speed and quality. Other tasks are "outside" the frontier, where AI struggles and might even decrease quality despite increasing speed.
According to Dr Hardman’s research, in L&D this looks as follows:
Inside the Frontier: Creative tasks (e.g., generating ideas), writing tasks (e.g., drafting descriptions), and productivity tasks (e.g., writing emails, generating content from text).
Outside the Frontier: Complex needs analysis, strategic instructional strategy development, nuanced instructional design, and learning content creation requiring deep understanding of learner motivation and engagement.
4. What is the difference between "Generic AI Tools" and "Specialised AI Tools"?
Generic AI Tools: Trained on massive, general datasets (like the internet), these tools can handle a wide variety of tasks but lack deep specialisation in any particular field. Example: ChatGPT.
Specialised AI Tools: Trained on focused, industry-specific datasets, these tools excel at a narrower range of tasks within their domain, offering expertise and efficiency for specific workflows. Examples include tools for coders (Cursor) and medical professionals (Hippocratic).
5. What might a specialised AI tool for L&D look like?
"Epiphany" is a specialised “co-pilot” tool for L&D which Dr Hardman has prototyped in order to test the impact of specialised AI on L&D:
Training: Trained by experts on instructional design best practices and datasets, incorporating current research and standards.
Features: Assists L&D professionals in real-time with needs analysis, design brief creation, instructional strategy development, content outlining, and even storyboard generation.
Benefits: Aims to increase the speed and quality of instructional design work, while reducing the cost of analysis and design. Research suggests it could drastically decrease "scrap learning" by aligning content creation with real-world application.
6. What are the risks of prioritising speed over quality when using AI in L&D?
While AI can significantly accelerate all L&D tasks, solely focusing on speed can be detrimental. Using AI for all tasks - including those "outside the jagged frontier" - can lead to:
Lower quality output: Content might be generic, lacking engagement, or misaligned with learning science principles.
Ineffective learning experiences: Learners might not achieve desired outcomes, wasting time and resources.
Erosion of trust: Stakeholders may lose faith in L&D's ability to deliver impactful learning solutions.
7. What are the key takeaways for L&D professionals navigating the rise of AI?
Be mindful of the "jagged frontier": Understand which tasks AI can effectively support and which tasks require human expertise.
Prioritise specialised AI tools: Whenever possible, choose tools specifically designed for L&D to ensure higher quality and relevance.
Focus on quality & impact: Don't sacrifice quality for speed. Use AI strategically to enhance, not replace, your expertise as an L&D professional.
8. How can I learn more about the specialised AI tool "Epiphany" and the potential impact of tools like this on L&D?
To stay informed:
Register your interest in Epiphany here.
Follow Dr Philippa Hardman.
Engage in broader discussions on AI and L&D.