How AI Virtual Assistants Execute Actions
Picking up from where we left off with Natural Language Processing (NLP) in AI Virtual Assistants, we continue to dive deeper into the inner workings of how Aisera is able to understand such a wide range of intents with such high accuracy. After the Aisera AI Virtual Assistant receives a request from a user, the NLP and NLU capabilities are instantly engaged to decipher the intent behind the utterance, using techniques such as tokenization, lemmatization, and tagging to classify the request and map the intent to an ever-growing ontology and taxonomy.
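To make this concrete, here is a minimal, purely illustrative sketch in Python of how an utterance might flow through tokenization, lemmatization, and keyword-based intent mapping against a taxonomy. The taxonomy entries, lemma table, and scoring rule are hypothetical stand-ins for the purpose of the example, not Aisera's actual models.

```python
# Illustrative sketch only: a toy pipeline showing how an utterance might be
# tokenized, lemmatized, and mapped to an intent in a taxonomy. The taxonomy,
# lemma table, and scoring below are hypothetical, not Aisera's implementation.

import re

LEMMAS = {"resetting": "reset", "reset": "reset", "passwords": "password",
          "password": "password", "needs": "need", "need": "need"}

# Hypothetical slice of a global taxonomy: intent -> keywords that signal it.
TAXONOMY = {
    "it.account.password_reset": {"reset", "password"},
    "it.software.provision":     {"install", "provision", "license"},
    "hr.timeoff.request":        {"vacation", "pto", "timeoff"},
}

def tokenize(utterance: str) -> list[str]:
    """Lowercase and split the utterance into word tokens."""
    return re.findall(r"[a-z']+", utterance.lower())

def lemmatize(tokens: list[str]) -> list[str]:
    """Map each token to its base form using the toy lemma table."""
    return [LEMMAS.get(t, t) for t in tokens]

def classify(utterance: str) -> tuple[str | None, float]:
    """Score each taxonomy intent by keyword overlap; return the best match."""
    lemmas = set(lemmatize(tokenize(utterance)))
    best_intent, best_score = None, 0.0
    for intent, keywords in TAXONOMY.items():
        score = len(lemmas & keywords) / len(keywords)
        if score > best_score:
            best_intent, best_score = intent, score
    return best_intent, best_score

print(classify("I need help resetting my password"))
# -> ('it.account.password_reset', 1.0)
```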
The ontology and taxonomy vary across domains to broaden the scope of understanding, allowing the AI Virtual Assistant to remain nimble and catch incoming requests without needing to be specifically trained on any individual domain.
In rare cases where the request cannot be mapped to any existing intent in the global taxonomy, an exception handler manages the outlier request and informs the user that the AI Virtual Assistant cannot support it at this time, while offering options like creating a ticket or transferring to a live agent so as not to lose momentum in addressing the user’s needs. Assuming this is not the case and the exception handler is not needed, the next step in the process is to begin fulfilling the request based on the AI Virtual Assistant’s understanding of the extracted intents and entities.
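A fallback path of this kind could look roughly like the sketch below. The confidence threshold, option names, and reply text are assumptions for illustration, not Aisera's actual exception handler.

```python
# Hypothetical fallback handling for unmapped intents. The threshold, option
# names, and reply text are assumptions for illustration, not Aisera's actual
# exception handler.

CONFIDENCE_THRESHOLD = 0.5  # assumed cutoff for accepting an intent match

def handle_intent_result(intent: str | None, score: float) -> dict:
    """Route a classified request, falling back gracefully when no intent matched."""
    if intent is None or score < CONFIDENCE_THRESHOLD:
        # Outlier request: tell the user, but keep momentum with alternatives.
        return {
            "reply": "I can't support that request just yet. Would you like to "
                     "create a ticket or speak with a live agent?",
            "options": ["create_ticket", "transfer_to_live_agent"],
        }
    # Otherwise, hand off to fulfillment with the extracted intent.
    return {"next_step": "fulfill", "intent": intent, "confidence": score}

print(handle_intent_result(None, 0.0))                          # fallback path
print(handle_intent_result("it.account.password_reset", 0.92))  # fulfillment path
```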
Furthering the Conversation
The impetus behind any request, be it fact-finding or action-oriented, is that the user wants to accomplish something. A Dialog Manager is responsible for pushing the process along at this point. What makes the Dialog Manager so important is that it weighs past and present interactions and information (asking for additional clarification if required) and determines a set of next best actions to complete the request.
Simply put, Yt+1 is the best action the AI Virtual Assistant can take at this point in the conversation, based on the historical information gathered since the conversation started, Xt (which is really Xt, Xt-1, Xt-2, …, Xt-n). In this context, the next best action Yt+1 can be either seeking clarification to avoid a misunderstanding or, if the intent is understood, fulfilling the request with the most relevant information or action. The Dialog Manager uses an API call to pull the most useful prediction models for determining the next best action for the request at hand. After the request is understood and the prediction of the next best steps is complete, the AI Virtual Assistant follows up based on one of three intent types: action, information, or casual. “Action” intents dictate that the AI Virtual Assistant must perform an action, whereas “information” intents prompt it to search for, retrieve, and serve relevant data from a knowledge base or trusted external sources. A “casual” intent classification means the Assistant should respond with informal, easygoing small talk.
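A toy Dialog Manager illustrating this turn-by-turn decision might look like the following sketch. The Turn structure, confidence cutoff, and intent-to-type mapping are invented for the example and are not a description of Aisera's prediction models.

```python
# A minimal Dialog Manager sketch: given the turn history Xt..Xt-n, pick the
# next best action Yt+1. The intent-type mapping and clarification rule are
# hypothetical simplifications of the process described above.

from dataclasses import dataclass

INTENT_TYPES = {                      # assumed intent -> type mapping
    "it.account.password_reset": "action",
    "company.holiday_schedule":  "information",
    "greeting.hello":            "casual",
}

@dataclass
class Turn:
    utterance: str
    intent: str | None
    confidence: float

def next_best_action(history: list[Turn]) -> dict:
    """Yt+1: ask for clarification if the latest intent is uncertain, else fulfill it."""
    latest = history[-1]
    if latest.intent is None or latest.confidence < 0.5:
        return {"type": "clarify",
                "prompt": "Could you tell me a bit more about what you need?"}
    intent_type = INTENT_TYPES.get(latest.intent, "information")
    if intent_type == "action":
        return {"type": "execute_action", "intent": latest.intent}
    if intent_type == "information":
        return {"type": "retrieve_knowledge", "intent": latest.intent}
    return {"type": "small_talk", "intent": latest.intent}

history = [Turn("hi there", "greeting.hello", 0.9),
           Turn("I need to reset my password", "it.account.password_reset", 0.95)]
print(next_best_action(history))      # -> {'type': 'execute_action', ...}
```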
Doing the Work
The three intent classifications all feed into a larger Action Execution module, which completes the actions associated with the user’s request. The module is aided by User Profiling and User Preference Services, allowing the Aisera AI Virtual Assistant to comprehend parameters like a user’s eligibility for provisioning a piece of software or what data has previously been served to similar users. This loop is both dynamic and self-adaptive, continuously monitoring all user interactions in real time using a technique known as reinforcement learning. What’s more, reinforcement learning allows the AI to catalog the associated user preferences on an intent-by-intent basis, providing a robust learning model that captures the nuances endemic to a given user profile, their preferences, and their intents, allowing the AI Virtual Assistant to offer more well-received resolutions to each user request. Once the appropriate actions have been executed, it is time to notify the user of the results, and that’s where Natural Language Generation comes into play.
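As a rough illustration of how action execution, profile checks, and preference learning could fit together, the sketch below uses a toy eligibility rule and a simple feedback-driven preference score. The user profile, reward signal, and update formula are assumptions, not Aisera's reinforcement learning model.

```python
# Illustrative sketch of an Action Execution loop with a user-profile check and
# a simple reinforcement-style preference update. The eligibility rule, reward
# signal, and update formula are assumptions made for this example.

from collections import defaultdict

USER_PROFILES = {"alice": {"role": "engineer", "licenses": {"ide"}}}

# Learned preference scores per (user, intent) pair, nudged by user feedback.
preferences: dict[tuple[str, str], float] = defaultdict(float)

def eligible(user: str, intent: str) -> bool:
    """Toy eligibility check, e.g. can this user be provisioned this software?"""
    return USER_PROFILES.get(user, {}).get("role") == "engineer"

def execute_action(user: str, intent: str) -> str:
    """Perform the action if the profile allows it, otherwise escalate."""
    if not eligible(user, intent):
        return "escalate_for_approval"
    return "provisioned"

def record_feedback(user: str, intent: str, reward: float, lr: float = 0.1) -> None:
    """Nudge the stored preference toward the observed reward (e.g. thumbs up/down)."""
    key = (user, intent)
    preferences[key] += lr * (reward - preferences[key])

result = execute_action("alice", "it.software.provision")
record_feedback("alice", "it.software.provision", reward=1.0)
print(result, round(preferences[("alice", "it.software.provision")], 2))
```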
Talking the Talk
When the time comes to communicate with the user, Aisera’s AI Virtual Assistant leverages the Natural Language Generation module to offer conversational, human-like responses over the course of the interaction. This module draws from a bank of response templates, with each template containing a number of different phrases that all impart the same meaning. This circumvents outmoded “phone tree” style responses, which bore users and lead to immediate escalation to a live agent. To further develop rapport between the AI Virtual Assistant and the end-user, the NLG module uses the sentiment analysis results from the NLU module to provide responses that are consistent with the user’s mood and emotional state, as opposed to apathetic blanket responses. The dialog flows between the user and the system all follow this pattern of utterance processing to response generation, repeating every turn in the dialog, all recorded by the Dialog Manager to power the reinforcement learning process. The simultaneous use of these techniques, combined with a massive global ontology and taxonomy, covers an extensive amount of ground, enabling the Aisera AI Virtual Assistant to understand open-ended goals as users interact with it, much as they would with a live agent.
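A simple way to picture template-based, sentiment-aware response generation is sketched below; the template bank and sentiment labels are invented for illustration and are not Aisera's response library.

```python
# A toy Natural Language Generation sketch: a bank of response templates with
# several phrasings per template, chosen partly by user sentiment. Templates
# and sentiment labels are hypothetical examples.

import random

TEMPLATES = {
    ("password_reset_done", "neutral"): [
        "Your password has been reset. You're all set!",
        "Done! Your new password is ready to use.",
    ],
    ("password_reset_done", "frustrated"): [
        "Sorry for the trouble. Your password has been reset and you can log in now.",
        "I know that was frustrating. The reset is complete, so you're good to go.",
    ],
}

def generate_response(template_id: str, sentiment: str) -> str:
    """Pick a phrasing that matches the user's mood, falling back to neutral."""
    phrases = TEMPLATES.get((template_id, sentiment)) or TEMPLATES[(template_id, "neutral")]
    return random.choice(phrases)

print(generate_response("password_reset_done", "frustrated"))
```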
Refining the Process
Just as with any process, it takes time to develop and tailor the solution to achieve the best results. Already lightyears ahead of other solutions, Aisera continuously and autonomously hones its understanding of intents, their associated actions, and domains from user requests to become better suited to the individual needs of the user and industry of the business. In the next blog in our series on NLU, we will cover the extensive language models, different domain ontologies and taxonomies, and a few more of the unique qualities that make Aisera AI Virtual Assistants the solution of choice for discerning businesses.
What it Means for Users
NLP in AI Virtual Assistants is only one piece of the puzzle, and the example provided merely scratches the surface of the exact inner workings of Aisera’s AI Virtual Assistant, but it does offer a more technical glimpse into how inbound data is handled by the NLP module. With NLP and NLU, businesses gain the ability to instantly engage users without resorting to the narrow scope of an archaic scripted dialog flow. NLP and NLU enable Aisera to be like the World Cup goalkeeper – catching incoming requests, no matter the speed, spin, or angle of attack. In our next blog, we will take a look under the hood of how Aisera built its world-class Natural Language Understanding (NLU) module and the benefits to businesses and end-users alike.