Introduction

Chatbots are here, and they are here to stay. They provide a new, natural way to interact with users, and they let you be present in the channels your users already use.

One of the frameworks for building chatbots is the Microsoft Bot Framework, and one of the programming languages you can use with it is Node.js. In this article we will see how to build an intelligent chatbot using both.

Channels

There are different channels where your chatbot can live, including mobile apps, websites, and mail, and of course messaging channels. The problem is that every channel communicates in a different way, which becomes an issue when you want to target several channels. This is one of the advantages of the Microsoft Bot Framework: you deploy your bot, connect it to the channels, and Microsoft is responsible for the connections and for translating the messages into a common message format. Once you deploy the Bot Channel Registration, you get a simple GUI to add channels to your bot:

Two important notes here:

Many people believe that bots developed with the Microsoft Bot Framework can only be deployed to Azure. In fact, you can deploy your chatbot to whatever platform you want (Google Cloud, DigitalOcean, AWS, …) and connect it to the Bot Channel Registration.

Bot Channel Registration is free for non-premium channels. The premium channels are Web Chat and the Direct Line API. Premium channels can be deployed as F0 (free, but limited to 10k messages per month) or S1 ($0.42 per 1,000 messages). Note also that a message is any communication from the client to the chatbot, but also from the chatbot to the client.

UI, Clickbots, Intelligence…

Chatbots today are not only text based. They have started to add UI elements to interact with users, like buttons, images, locations, and carousels, and they are still evolving today.

You can check the capabilities of Adaptive Cards, Microsoft's proposal for using those UI capabilities in a common way across different channels.

With such powerful tools, you can build a good chatbot that interacts with the user even without artificial intelligence. Chatbots where the user interacts by clicking buttons (or other UI elements like calendars, comboboxes, …) but which do not understand human language are commonly known as clickbots.

But there is a way to add intelligence to your chatbots: NLP (Natural Language Processing). Natural Language Processing is composed of two parts:

NLU: Natural Language Understanding. Maps utterances (phrases as a human would say them) to intents (the action the utterance expresses). It should also include NER (Named Entity Recognition).

NLG: Natural Language Generation. Produces meaningful sentences as answers.

We will understand this better with an example:

Here the utterance is "I want to travel to Barcelona tomorrow", which the NLU interprets as the intent user.travel. Two named entities are also recognized. Think of the intent as the name of an action of the chatbot, and of the named entities as the parameters needed for that action.
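The result of such an NLU pass is typically a structure like the following hand-written sketch (field names vary between engines; this one loosely follows the shape node-nlp returns):

```javascript
// Hypothetical NLU result for "I want to travel to Barcelona tomorrow".
// The intent names the action; the entities are its parameters.
const nluResult = {
  utterance: 'I want to travel to Barcelona tomorrow',
  intent: 'user.travel',
  score: 0.97, // classifier confidence, compared against a threshold
  entities: [
    { entity: 'city', sourceText: 'Barcelona' },
    { entity: 'date', sourceText: 'tomorrow' },
  ],
};

// The bot routes on the intent and reads the action parameters
// from the recognized entities:
const params = Object.fromEntries(
  nluResult.entities.map((e) => [e.entity, e.sourceText])
);
console.log(nluResult.intent, params);
```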

node-nlp

There are many online NLP products, like DialogFlow (Google), Wit.ai (Facebook) or LUIS (Microsoft), and you can find comparisons of them on the internet. One of the big differences that makes DialogFlow the leader is its use of contexts, so the processing of intents can change based on parameters of the conversation, together with its tools for defining the answers given to users. This matters because LUIS does not provide those kinds of features. You may also have heard about QnA Maker. QnA Maker is pretty good for building bots from your FAQ, but it's important to understand that it is not an NLP engine; if you test it in depth, it seems to use string-distance algorithms rather than artificial intelligence. In my case, I tested it by training a FAQ written in the Black Speech from The Lord of the Rings, and the example still worked, so internally it does not seem to interpret anything differently based on the language.

There are also on-premise products like RASA or Snips, which you can deploy inside your own company. The reason for choosing an on-premise solution is usually legal or security related: you have to avoid sending confidential information from your users to third parties. But there can be other reasons, like privacy, customization, or price.

In my case, I will be using node-nlp, a library written in Node.js and based on Natural. The reasons for using it as a library are several: it allows real-time training, total customization of the NLP, direct integration of the NLG with the context the bot keeps for each conversation, and easy integration; there is no need to deploy it separately (RASA and Snips have to be deployed, with an effort and a cost); and it performs well.

About performance: having the NLP as a library means you don't have to call an API to resolve the intent, so there is no network round trip. This is very important, because before sending the answer to the user the chatbot may pass through several steps or business processes, and you can end up keeping your user waiting. To see the difference, here is a small comparison of a bot built with node-nlp against the same bot resolving its intents with LUIS:

As fast as 1 millisecond to calculate the answer for the user!

Creating a bot with node-nlp

To create a bot using node-nlp, you basically create a basic bot and use node-nlp to load the model or the Excel file. It's important that, once you have generated a model, you don't retrain if the Excel file has not been modified, because training is time consuming.

To modify the bot's behaviour, just edit the Excel file provided with the bot. There you can change the intents and the answers:

After that, you can retrain your bot with the new Excel file.

To test your bot before deploying it, I suggest using the Microsoft Bot Framework Emulator, so you can test locally. This is the bot provided here running:

Intents and dialogs

If you have already developed a chatbot using the Microsoft Bot Framework and LUIS, you know that the way the framework uses the NLP is different from what is shown in this example. Since LUIS does not provide NLG, and returns the intent but not a generated answer, the Microsoft Bot Framework proposes working with triggers. Every message received by the bot is passed to the LUIS recognizer before being processed; if LUIS recognizes the message as an intent, and a dialog exists that is triggered by that intent, then instead of continuing along the normal message-processing path, that dialog is triggered. An example of code:

The node-nlp recognizer can be used as a drop-in replacement for the LUIS recognizer provided by Microsoft. Here is Microsoft's example, rewritten using the node-nlp recognizer:

For this example the training is done directly in code, which is another way of training node-nlp; because it is programmatic, it allows you to change the chatbot's behaviour in real time by changing intents and responses. This is the bot working:

Hybrid

The problem comes when you have a complex chatbot with both contextualized intelligence and a complex dialog tree at the same time. In those cases, integrating the NLP is very difficult using the Microsoft Bot Framework and LUIS. A large part of the hard work is caused by the assumptions the Microsoft Bot Framework makes about the recognizer: its core contains the logic for calculating the message route, without considering how to integrate NLG into that routing.

But you can change the disambiguation route, overriding the default behaviour provided by the Microsoft Bot Framework with your own way of calculating the route based on the current status (the dialog stack) and the session information. This is what node-nlp does internally, adding the capability of having answers, routing the Microsoft way, or even routing from the answer itself: if an answer starts with /, the bot is automatically routed to the dialog with that name.

To achieve this, instead of calling bot.recognizer(recognizer), call recognizer.setBot(bot, true, 0.7). This tells the recognizer to register itself with the bot, activates the custom disambiguation route, and sets the threshold for the NLG: the NLU must return a score of at least 0.7.

Full code of the hybrid chatbot:

This bot executing:

Conclusion

It's very important to have good routing in the chatbot, so that both worlds, intelligent bots and clickbots, can live together to achieve the best user experience. The user should get the benefits of the great UI of clickbots, combined with the intelligence of Natural Language Processing.

It's also very important to separate the answer generation from the code, because this is part of the NLG. It allows you to change the behaviour of your bot completely without having to develop and redeploy. Being able to control even the dialog stack and flow from the NLG is a very powerful tool.

Repositories

All the example code can be found in the GitHub repo:

https://github.com/jseijas/bot-nlp

The full code of node-nlp can be found in its repo, and you're more than welcome to contribute!

https://github.com/axa-group/nlp.js

node-nlp is based on the Natural project, which you can find at:

https://github.com/NaturalNode/natural

Microsoft Botbuilder can be found at: