Generative AI for L&D: Today’s trends, tomorrow’s tech

• 4 min read

If you missed LT UK, don’t worry; we captured the highlight session. Presenting to a standing-room-only audience, Giuseppe Tomasello, Docebo’s VP of Artificial Intelligence and founder of Edugo.ai, talked about:

  • What’s happening in the world of L&D
  • What’s changing in the world of artificial intelligence 
  • What Docebo is working on to capitalize on these new innovations 

Read our summary below for a high-level recap of what you missed, or check out a recording of the full session HERE.

A lot is changing in L&D and artificial intelligence

Ten years ago, internal training and talent development accounted for more than two-thirds of what Docebo customers were doing on our platform. Today, those use cases represent only one-third of Docebo usage, as high-performing L&D teams increasingly train non-employee audiences like customers and partners.

On a similar timeline, artificial intelligence has become remarkably good at a number of simple human tasks. Notice the large spikes around 2010:

Tech evolution is happening fast. Just 20 years ago, we started using computers for simple human activities like handwriting recognition. Soon, technologies like Optical Character Recognition (OCR) were helping transcribe images of handwriting into editable text on computers. 

Today, we’re throwing way more at artificial intelligence than deciphering sloppy cursive. Common use cases where AI can perform at human standards include: 

  • Handwriting recognition
  • Speech recognition 
  • Image recognition 
  • Reading comprehension
  • Language understanding

Forward jumps

There are two big innovations supercharging AI’s impact on learning today. The first is one you’ve heard of, but the second is something Docebo has been working on behind the scenes. Let’s start with a brief overview of Large Language Models (LLMs), the one you’ve heard of, with a linguistic spin.

In fancy terms: Large language models are able to encode a semantic model of the world. 

In simple terms: LLMs use math to approximate how well two words fit together. 

Consider this statement, famously written by theoretical linguist Noam Chomsky:

“Colorless green ideas sleep furiously”

While grammatically perfect, this statement makes no semantic sense. The words follow our expected rules of grammar, but they don’t mean anything. This is the problem that LLMs help solve. 

The image above represents how LLMs use big datasets to create a map of semantic relationships between words. If you’ve used the most popular generative AI tools, you’ve seen the word-by-word drip these applications produce. Through this semantic mapping, LLMs return language that sounds less like Chomsky’s nonsense statement and more like the language you expect to hear from another person.
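A minimal sketch of the “math” in question: in an embedding space, semantically related words point in similar directions, and a similarity score measures how well they fit together. The three-dimensional vectors below are invented purely for illustration; real models learn vectors with hundreds of dimensions from large text corpora.

```python
import math

def cosine_similarity(a, b):
    """Score how well two word vectors 'fit together' (1.0 = same direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy embeddings, invented for illustration only.
embeddings = {
    "ideas":  [0.9, 0.1, 0.0],
    "theory": [0.8, 0.2, 0.1],
    "sleep":  [0.0, 0.1, 0.9],
}

# Related words score higher than unrelated ones.
related = cosine_similarity(embeddings["ideas"], embeddings["theory"])
unrelated = cosine_similarity(embeddings["ideas"], embeddings["sleep"])
```

In a toy space like this, “ideas” sits much closer to “theory” than to “sleep”, which is the kind of signal that lets a model prefer sensible word combinations over Chomsky-style nonsense.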

But of course, this approach has limitations, especially when it comes to generating learning content.

Introducing the Knowledge Engine

This is the second big innovation we mentioned above, because frankly, today’s LLM-powered content creation tools are…fine. Not horrible. Not breathtakingly impressive. Useful, sure. But regular, normal, everyday fine.

As shown above, LLMs are more than capable of generating text. But we think learning content deserves better than that. And we know that high-performing L&D teams set higher standards for their programs.

AI-generated content is hilariously boring.

Learners are world-class experts at sussing out boring content and AI-generated content is hilariously boring. That’s because when tools just wrap themselves around an existing LLM, they inherit the ‘averageness’ of the content that gets created. 

This was our impetus to build a middle layer between your Docebo tools and LLMs. We call this layer the Knowledge Engine. 

The Knowledge Engine has two different components:

  • Embedded pedagogical resources, like smart templates and content design principles that we’ve defined and encoded. In simple terms, this means that everything created with Knowledge Engine is optimized for learning and is ‘pedagogically sound.’ 
  • Knowledge management defined by you, the user. 

By creating a middle layer between the presentation layer (like an authoring or simulation tool) and the LLM, we give businesses and users the power to create tailored learning content while still having complete control of their proprietary information and documents. This is the type of context that instructional designers and content creators bring to life through the material they make, and it’s the type of context that mainstream LLMs can’t currently capture. 
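As a rough illustration of the middle-layer idea (every name below is hypothetical; this is not Docebo’s actual implementation), such a layer might retrieve relevant user-managed documents and combine them with an encoded pedagogical template before anything reaches an LLM:

```python
# Hypothetical sketch of a middle layer between an authoring tool and an LLM.

PEDAGOGICAL_TEMPLATE = (
    "You are writing a learning module.\n"
    "Structure: objective, explanation, worked example, knowledge check.\n"
    "Ground every claim in the provided company context.\n\n"
    "Company context:\n{context}\n\n"
    "Topic: {topic}\n"
)

def retrieve_company_docs(topic, knowledge_base):
    """Naive keyword retrieval, standing in for a real semantic search."""
    words = set(topic.lower().split())
    return [doc for doc in knowledge_base
            if words & set(doc.lower().split())]

def build_prompt(topic, knowledge_base):
    """Combine user-managed knowledge with encoded design principles
    before the request ever reaches the LLM."""
    context = "\n".join(retrieve_company_docs(topic, knowledge_base))
    return PEDAGOGICAL_TEMPLATE.format(context=context, topic=topic)

# The presentation layer would then hand the assembled prompt to whichever
# LLM is currently in use, e.g.:
# draft = call_llm(build_prompt("onboarding checklist", company_docs))
```

The design point is that the proprietary documents and the pedagogical rules live in the middle layer, under the organization’s control, rather than inside the model itself.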

The Knowledge Engine provides context and helps you create a final draft to review, not a first copy to rework.

This also provides a level of security and control that IT, security, and governance teams will love. Company information kept in the Knowledge Engine is not being used to train any of the LLMs implemented by Docebo. Instead, the Knowledge Engine provides context and filters out the generic, average content so you get a final draft to review, not a first copy to rework. In general, Docebo does not use customer data to train LLMs or any other Docebo model. 

One last thing…

We’ve always been excited about AI—even before it was a hot craze. Our first AI creation tool was released in 2021, when building such tools was still quite difficult and time-consuming. We’re also excited about the many large language models out there. And the dozens more that are sure to pop up next week.

That said, Docebo is a learning platform built for scale, and building strong, modular systems takes time. As we develop our AI tools for release later this year, we’re focused on a flexible, reliable system that lasts: one that lets us swap in different LLMs over time and keep pace with the rapid development of these technologies.

Anyone can plug into a public API, and many organizations do just that and call it a day. But leaning so heavily on a single model makes a system fragile. Managing and testing multiple AI services and LLMs behind the scenes is the tricky part, but we think it’s worth it, because it creates a more powerful, stable, and secure tool for our customers.
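One common way to avoid depending on a single model is a provider abstraction, sketched below with hypothetical names (a real provider class would wrap a vendor’s API behind the same interface; none of this is Docebo’s code):

```python
from abc import ABC, abstractmethod

class LLMProvider(ABC):
    """Common interface so application code never depends on one vendor."""

    @abstractmethod
    def generate(self, prompt: str) -> str:
        ...

class EchoProvider(LLMProvider):
    """Stand-in provider for illustration; a real one would call a model API."""

    def generate(self, prompt: str) -> str:
        return f"[draft based on: {prompt}]"

def create_content(provider: LLMProvider, prompt: str) -> str:
    # Callers see only the interface, so swapping models later means
    # adding a new provider class, not rewriting the application.
    return provider.generate(prompt)
```

Testing several such providers against the same interface is what makes it practical to switch models as better ones appear.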

We’re excited to share more with you in the coming weeks and months about what we’re building. And we’re committed to continually improving our AI offering based on your feedback and experience. 

Look out for more on the Knowledge Engine, and our upcoming AI Authoring tools later this year. 

The Docebo AI Team