IT’S ALL ABOUT PERSONALIZATION OF LEARNING OUTCOMES
Last year, I started a journey to better understand the technologies and concepts around artificial intelligence, or augmented intelligence (AI), and its use in customer education. I’ve continued researching, and decided it was time to bring others along on the journey.
It started with identifying the levels at which AI is available for learning, and in subsequent content, I’m going to delve more deeply into each of the technologies and how they apply to customer education. The goal is a consumer-grade experience with heavy personalization.
The three levels of AI
It all really starts with chatbots. They’re everywhere. Most websites offer a way to open an initial discussion, but the backend of these bots is all about natural language understanding: identifying the context and responding to questions or continuing the discussion. This is followed by natural language generation, which carries on that conversation in whichever language is in use. The work in this area lies in building the library of conversations the chatbot can draw on. These skills may be built on a decision-tree structure that directs the bot through the conversation. At some point, the conversation may become too complex for the chatbot to continue, and that is the point at which the questions are handed off to the next level, which may be a real person or knowledge-base documentation.

The possibilities are extensive, but they require planning. Will the response be the same for interactions based on localization? Most technologies at this level have some degree of machine learning, meaning that increased interactions will improve the bot’s ability to respond. There are several low-code or no-code tools available to add to existing learning tools, and many products already include the technology in their toolkits, most often as a “chat with Sales” or “chat with Support” button. On the design side, don’t forget about the persona of the chatbot. Should it be an avatar, or look like a real person? Most likely, the decision will be based on the culture of the organization, as well as localization efforts, so the persona that appears for Kristine in the US may not be the same as the one that appears for Divya in India.
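To make the decision-tree idea concrete, here is a minimal sketch of how such a bot might walk a conversation and hand off when it runs out of branches. All node names, prompts, and keywords are hypothetical examples, not any particular chatbot platform’s API.

```python
# Minimal sketch of a decision-tree chatbot with a hand-off step.
# Node names, prompts, and keywords are made up for illustration.

HANDOFF = "handoff"
HANDOFF_MSG = "Let me connect you with a person or our knowledge base."

# Each node pairs a prompt with the branches the learner can take.
TREE = {
    "start": {
        "prompt": "Are you asking about courses or certifications?",
        "branches": {"courses": "courses", "certifications": "certs"},
    },
    "courses": {
        "prompt": "Would you like the beginner or advanced catalog?",
        "branches": {"beginner": HANDOFF, "advanced": HANDOFF},
    },
    "certs": {
        "prompt": "Exam questions go to a specialist.",
        "branches": {},
    },
}

def respond(node_name, user_text):
    """Walk one step of the tree; hand off when the conversation
    becomes more complex than the tree can handle."""
    node = TREE.get(node_name)
    if node is None or not node["branches"]:
        return HANDOFF, HANDOFF_MSG
    for keyword, next_node in node["branches"].items():
        if keyword in user_text.lower():
            if next_node == HANDOFF:
                return HANDOFF, HANDOFF_MSG
            return next_node, TREE[next_node]["prompt"]
    return node_name, node["prompt"]  # not understood; re-prompt
```

In practice a low-code tool would hide this structure behind a visual builder, but the shape is the same: a tree of prompts, keyword matching, and an explicit hand-off state.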
Moving to the next level usually includes voice. It can be just a chatbot with audio, but in many cases this next level is an agent, or software bot. The difference is action: this level assumes that an action will take place. These audio tools are also widely used: Alexa, Siri, Bixby, and more. In addition to basic questions and answers, the software bot can be directed to complete a task, like adding an item to a grocery list or playing a Spotify playlist. For customer education, where the discussion is how to bring a consumer-level experience to learning, this tool could play a huge role. Many of these agents work along the same lines as the chatbot, in that the skills are built with specific responses in mind, and machine learning will help make the tools “smarter” and even more responsive. For customer education, this can mean voice interactions in the courses or course catalog. Some of the larger planning discussions should include who would own these activities and whether there should be headcount associated with them. Asking questions of existing vendors should also be part of the planning process.
All of these transactions will also generate data that will then need to be analyzed, probably using machine learning.
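Even before any machine learning, a simple tally of the transaction data is revealing. As a sketch, assuming a hypothetical log format, counting where conversations hand off to a human points at the next bot skills to build:

```python
# Tally where chatbot conversations break down. The log format is a
# made-up example; real tools export something richer.
from collections import Counter

interaction_log = [
    {"topic": "certifications", "handed_off": True},
    {"topic": "course catalog", "handed_off": False},
    {"topic": "certifications", "handed_off": True},
]

handoffs = Counter(
    entry["topic"] for entry in interaction_log if entry["handed_off"]
)
# The most common hand-off topics are candidates for new bot skills.
top_gaps = handoffs.most_common(1)
```

Machine learning would take this further, clustering conversations and predicting which learners need which content.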
And next up are search algorithms. These allow searching across multiple sources: education content in a catalog, knowledge-base articles, support articles, or forum discussions. Algorithms bring their own set of issues; the predominant discussions currently revolve around making sure that the algorithms are tested for consistency, equitability, and validity.
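A minimal sketch of what “searching across multiple sources” means in practice: query each source, merge the hits, and apply one consistent ranking. The source names and the word-overlap scoring rule here are illustrative only; production systems use proper relevance models.

```python
# Sketch of federated search: query several content sources, merge,
# and rank with one rule. Sources and scoring are illustrative.

def search_source(source_name, documents, query):
    """Score documents by how many query words they contain."""
    words = query.lower().split()
    results = []
    for doc in documents:
        score = sum(word in doc.lower() for word in words)
        if score:
            results.append({"source": source_name, "doc": doc, "score": score})
    return results

def federated_search(query, sources):
    hits = []
    for name, docs in sources.items():
        hits.extend(search_source(name, docs, query))
    # One consistent ranking across all sources.
    return sorted(hits, key=lambda h: h["score"], reverse=True)

sources = {
    "catalog": ["Intro course on reporting", "Advanced analytics course"],
    "knowledge base": ["How to build a report", "Resetting your password"],
    "forums": ["Reporting tips from the community"],
}
results = federated_search("reporting course", sources)
```

Note that testing for consistency and equitability applies exactly here: the same ranking rule must treat every source, and every learner, the same way.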
All of these tools, together, can lead to a true consumer-grade experience. This diagram shows the various combinations available to build out different levels of collaboration and/or automation, and the best place to be is the center of the Venn diagram.
Combining cooperation and autonomous agents gives us collaborative agents: chatbots, or smartbots. Cooperation and learning combine into collaborative learning agents that feed potential learning opportunities through in-app options, presenting learning in the flow of work. Autonomous and learning combine to create interface agents, which could also be used in-app to build the user interface that presents the learning opportunities. The point at which all of these technologies join, the center of the Venn diagram, is the perfect utilization of the technologies. It gives the consumer-grade experience seen with Amazon, Netflix, and other streaming services: start with basic definitions of user preferences, roles, geographies, and more, then present content that fits that combination. The system continues to adapt to learner preferences as they happen, refining the content presented. The ultimate learner experience, complete with federated search to find content anywhere and allow the learner to look for content outside the specifications of their profile.
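The “define up front, then adapt” loop can be sketched in a few lines: filter the catalog by profile, then re-rank by what the learner has actually engaged with. All profile fields, content tags, and titles below are hypothetical.

```python
# Sketch of profile-based personalization with adaptation. Profile
# fields, tags, and titles are invented for illustration.

catalog = [
    {"title": "Admin basics", "roles": {"admin"}, "topics": {"setup"}},
    {"title": "Reporting deep dive", "roles": {"analyst"}, "topics": {"reports"}},
    {"title": "Dashboards 101", "roles": {"analyst", "admin"}, "topics": {"dashboards"}},
]

def recommend(profile, history):
    """Filter by role, then rank by overlap with topics the learner
    has engaged with, so recommendations refine over time."""
    viewed_topics = set()
    for item in history:
        viewed_topics |= item["topics"]
    eligible = [c for c in catalog if profile["role"] in c["roles"]]
    return sorted(
        eligible,
        key=lambda c: len(c["topics"] & viewed_topics),
        reverse=True,
    )

profile = {"role": "analyst", "geo": "US"}
history = [{"topics": {"reports"}}]
picks = recommend(profile, history)
```

Every new interaction added to `history` shifts the ranking, which is the adaptation loop described above in miniature; federated search would sit alongside this, unconstrained by the profile filter.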
Decisions and trends will determine whether in-app options, bots, or a combination of both become the initial starting point for learning opportunities. If you’re interested in more reading on the subject, check out the following links:
Next up, a deeper dive into chatbots and how they can be used in learning, followed by a deeper discussion on using voice, and then a deeper look at machine learning and what that can mean. By then, it will be time to re-evaluate the state of AI in learning and customer education.