
Integrating Augmented Intelligence in Customer Education

Chatbots and digital adoption platforms to launch personalization

Last time, I talked about the concepts of augmented (artificial) intelligence and the multiple types of activities that comprise AI in customer education. If you’d like to catch up, read that article here. My first deeper look into the structure and integration of the various tools that are available had to start with chatbots, because they are so common across the consumer ecosystem.


Chatbots can be the entry point into using machine learning and augmented intelligence in your customer education practice. Many people have established interactions with chatbots from other consumer-level experiences, from banking and tax preparation to ordering online. First-level questions are now handled by chat automation. The biggest benefit is the ability to leverage that automation and build stronger resources at the higher levels of interaction.

Types of chatbots

1. Rule-based Chatbots

These are the entry-level option for working with chatbots. They are designed as simple true/false or yes/no choices that take a discussion down a predefined path. They are not usually very complex to set up, but they may take time due to the multiple levels of true/false in a particular skill or workflow. Once the end of the workflow is reached, either because the response succeeded or because the issue is more complex than the workflow design, there is usually an option to move to the next level of support.
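To make the idea concrete, here is a minimal sketch of that predefined yes/no path as a decision tree. The questions, node names, and responses are invented for illustration; a real bot would prompt the user instead of reading scripted answers.

```python
# Each rule node holds a yes/no question and the next node for each answer.
RULES = {
    "start":   ("Are you having trouble logging in?", "reset", "billing"),
    "reset":   ("Have you already tried resetting your password?", "escalate", "send_reset"),
    "billing": ("Is your question about billing?", "billing_faq", "escalate"),
}

# Leaf nodes end the workflow, either with an answer or a hand-off to a human.
LEAVES = {
    "send_reset":  "I can send you a password-reset link.",
    "billing_faq": "Here is a link to our billing FAQ.",
    "escalate":    "Let me connect you with a human agent.",
}

def run(node, answers):
    """Walk the yes/no tree using a scripted list of answers."""
    while node in RULES:
        question, yes_next, no_next = RULES[node]
        answer = answers.pop(0)          # in a real bot, ask the user here
        node = yes_next if answer == "yes" else no_next
    return LEAVES[node]

print(run("start", ["yes", "no"]))  # -> I can send you a password-reset link.
```

Note the built-in escape hatch: any branch the designer did not anticipate ends at the "escalate" leaf, which is the "move to the next level" step described above.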

2. AI-based Chatbots

These chatbots are more complex because they have machine learning at their base. Complex does not have to mean hard. The chatbot understands the language, the context, and the intent of an open-ended question, and presents a response. The power of this type of chatbot is that it learns: it builds on its understanding of the language, the context, and the intent to create better, more detailed responses in future interactions.

3. Low-code or No-code

Low-code or no-code is exactly what it says: there is no need to know about coding to build the chatbot. It usually means using a visual interface to drag and drop either rules or prefabricated templates that you can configure. So if you’re not a developer, yay! One constraint of this type of development is that customization will be minimal, if it’s available at all.

Architecture of chatbots

1. Contextualization

Consider contextualization the engine that drives natural language processing (NLP). It interprets the input from the user and translates it into organized input the system can understand. Here’s where the algorithms come into play. Machine learning (ML) does the matching from the intent (the request from the user) to the chatbot’s designed intent list. The NLP engine is made up of two parts:

  • An intent classifier is the matching mechanism that sends the request from the user to the right response.

  • An entity extractor identifies the keywords in the user request.
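The two parts above can be sketched together. This toy version uses keyword overlap in place of a real ML model, and the intent names and entity lists are invented for illustration.

```python
# Each intent maps to the keywords the designer expects in matching requests.
INTENTS = {
    "reset_password": {"reset", "password", "forgot", "login"},
    "course_enroll":  {"enroll", "course", "register", "class"},
}

# Keywords the entity extractor knows how to pull out of a request.
ENTITIES = {"password", "course", "account", "invoice"}

def classify_intent(text):
    """Intent classifier: match the request to the best-fitting intent."""
    tokens = set(text.lower().split())
    scores = {name: len(tokens & keywords) for name, keywords in INTENTS.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "fallback"

def extract_entities(text):
    """Entity extractor: identify known keywords in the request."""
    return sorted(set(text.lower().split()) & ENTITIES)

request = "I forgot my password and cannot login"
print(classify_intent(request))   # -> reset_password
print(extract_entities(request))  # -> ['password']
```

A real NLP engine uses trained models rather than keyword sets, but the division of labor is the same: the classifier routes the request, and the extractor supplies the specifics the response needs.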

2. Front End

This is the component of chatbots that most of us would recognize. It’s the end-user interface for the interaction. Many chatbots will have an avatar or image available within an application. Once the user clicks on the avatar to begin the interaction, the workflow begins.

Decisions to make

  • Avatar persona: There are two current trends in this area – cartoon robots and pictures of real people. There are certainly more choices than that, but think of those as a spectrum from which to select.

  • Language: Should the base language be English? Or something else? Or more than one? What’s the tone? In most interactions, the tone is very conversational, regardless of base language.

  • Interface: What tool will present the chatbot?

Those decisions will roll into how the chatbot actually presents itself: pop-up windows, file-drawer expansions, or other available techniques.

3. Server Path

This is a traffic server that routes user requests to the proper components on the back end. These servers may be referred to as either node servers or traffic servers. Different bot designs will utilize different services to handle the traffic. Low-code/no-code tools will include a traffic server as part of the package, as will front-end tools.
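At its core, the routing step is a lookup from a classified intent to a back-end handler. Here is a hedged sketch of that idea; the handler names and back-end systems are placeholders, not real services.

```python
# Hypothetical back-end handlers the traffic server can forward to.
def handle_learning(payload):
    return f"learning-system: looked up '{payload}'"

def handle_support(payload):
    return f"support-queue: ticket opened for '{payload}'"

# The routing table maps each intent to its back-end component.
ROUTES = {
    "course_enroll":  handle_learning,
    "reset_password": handle_support,
}

def route(intent, payload):
    """Send the request to the component mapped to its intent."""
    handler = ROUTES.get(intent, lambda p: f"fallback: human agent for '{p}'")
    return handler(payload)

print(route("course_enroll", "Intro to Chatbots"))
# -> learning-system: looked up 'Intro to Chatbots'
```

In a packaged low-code/no-code tool this table is configured visually rather than written in code, but the routing logic underneath is the same.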

4. Integrations

Since these low-code/no-code tools are designed to be front ends to something else, you have control over what that something else is – your cloud learning system, your customer management system, or any other system that needs the chatbot at the leading edge of the consumer transaction.

Chatbots in customer education

The integration piece is where we could start to see the use of chatbots in customer education. In 2017, Learndot listed five ways that AI and chatbots would impact customer learning. The list is still valid:

  1. Influence programs and assets with chatbot data

  2. Create VR training simulations for customers

  3. Reinforce customer learning through chatbot responses

  4. Automatically update training programs

  5. Measure the effectiveness of training programs

The difference between the 2017 report and today is the advancement of digital adoption platforms (DAPs). The proliferation of tools that let you create in-app guidance via documents, walkthroughs, or step-by-step guides, driven by contextualization, brings another level of power to automation.

Earlier, I mentioned that using chatbots would require you to decide where to introduce your chatbot to the audience. With sales, we usually see it on the landing page of a website. With support, it usually launches from the support landing page. If we decide to use it as part of our customer education, where does it launch?

  1. From within the application, yet separate from the DAP tools? I like this option least, since it sets up a dual path to help the learner.

  2. From within the application? You would need to decide on the real estate for the avatar to reside, or how it would launch. Should it then launch the DAP as part of the intent structure? This is not my favorite, either. I’m not a fan of keeping the chatbot visible on a page, and would prefer not to go through multiple mechanisms.

  3. From within the DAP? Using the launch mechanisms from the digital adoption platform to activate the chatbot is my favorite, because then I can use the contextual DAP to answer the immediate need and launch the chatbot almost as a level two support tool to direct the learner to additional learning.

What’s next?

Watch for the technology to continue to advance. I watched a video today of completely animated, computer-generated people interacting. It was significantly more realistic than most video games, with the added attraction of “emotability”, a construct that removes the computer aspect from the interactions.

Check out my technology crushes of the moment to see where some of this is headed:

  • to personalize user journeys at scale

  • from script to screen with no inbetween

They’re both amazing in their use of the technology. And join me next time for a discussion on natural language processing (NLP).
