Social engineering has, by far, been the most effective hacking method, simply because we humans tend to give away more information than is required. There are countless stories, The Art of Deception by Kevin Mitnick (my favorite) among them, that show how social engineers quickly gain a victim's confidence to gather information. That information is later used to plan and execute hacks, sometimes against the entire enterprise!

Best practices to securely use Chatbots

In my previous blog, I highlighted practices that a developer should follow to build a secure chatbot. Consumers of a chatbot service, however, cannot dictate these security measures and have no control over their enforcement.

Here is a set of practices one should follow to avoid being exploited by cyber fraud:

1. Ensure the genuineness of the ChatBot:

Do not engage with chatbots that pop up over your web screen. Verify the authenticity of the website you are browsing and engage only with its embedded chatbot. Embedded chatbots on trusted websites and brands are built (I trust their developers) to encrypt the chat in transit and to have the security posture in place, as discussed in my last blog.
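As a minimal sketch of what "engage only with the embedded chatbot" could mean in code, the hypothetical helper below flags a chat widget that is not served over HTTPS or that is hosted on a different domain than the site being browsed (the function name and URLs are assumptions for illustration):

```python
from urllib.parse import urlparse

# Hypothetical check: trust a chat widget only if it is served over HTTPS
# and hosted on the same site (or a subdomain) as the page embedding it.
def is_trusted_widget(page_url: str, widget_url: str) -> bool:
    page, widget = urlparse(page_url), urlparse(widget_url)
    if widget.scheme != "https":
        return False  # the chat must be encrypted in transit
    page_host = page.hostname or ""
    widget_host = widget.hostname or ""
    return widget_host == page_host or widget_host.endswith("." + page_host)

print(is_trusted_widget("https://bank.example.com/home",
                        "https://chat.bank.example.com/widget.js"))  # True
print(is_trusted_widget("https://bank.example.com/home",
                        "http://bank.example.com/widget.js"))        # False
```

A real embedded widget is vetted by the site owner; this check only illustrates the two signals a cautious user can verify manually, the padlock (HTTPS) and the domain.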

2. Don’t fall prey to baits:

While engaging with chatbots, do not click on URLs. These can be baits that install malware on your system. Again, follow the practice above to ensure that you are talking to a genuine chatbot.
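The same caution can be automated on the chatbot operator's side. The sketch below, with an assumed allowlist and hypothetical function name, extracts URLs from a chat message and flags any host that is not on the site's own list, the kind of filter a genuine chatbot could run before rendering a link:

```python
import re

# Assumed allowlist of hosts the chatbot is permitted to link to.
ALLOWED_HOSTS = {"support.example.com", "example.com"}

URL_RE = re.compile(r"https?://([^/\s]+)", re.IGNORECASE)

def suspicious_links(message: str) -> list[str]:
    """Return the hosts of any links not on the allowlist."""
    return [host for host in URL_RE.findall(message)
            if host.lower() not in ALLOWED_HOSTS]

msg = "Your parcel is held! Pay at http://examp1e-delivery.top/fee now."
print(suspicious_links(msg))  # ['examp1e-delivery.top']
```

Note the look-alike domain in the example ("examp1e" with a digit one), a common bait that is easy to miss by eye but trivial for an allowlist to catch.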

3. Not voice, but still vishing:

Although a chatbot does not operate over a phone line, it is still a conversational agent, so do not end up sharing personal and financial information. Going by the categories of chatbots in my last blog, only enterprise chatbots built for RPA will seek authentication details such as a password, OTP, or date of birth, and those chatbots are accessed within a secure environment. With a web-based public chatbot, we should never share personal or financial information. The simple warning applies: "We do not seek any bank details, OTP, or password, so do not share these details with any of our agents."
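A public chatbot can reinforce that warning by screening outgoing messages for obviously sensitive data before they are sent. This is a minimal sketch with assumed patterns and labels, not a complete data-loss-prevention filter:

```python
import re

# Hypothetical patterns for data a public chatbot should never receive.
PATTERNS = {
    "card-like number": re.compile(r"\b(?:\d[ -]?){13,19}\b"),
    "OTP-like code": re.compile(r"\b\d{6}\b"),
    "password mention": re.compile(r"password", re.IGNORECASE),
}

def sensitive_hits(message: str) -> list[str]:
    """Return labels of sensitive patterns found in an outgoing message."""
    return [label for label, pat in PATTERNS.items() if pat.search(message)]

print(sensitive_hits("My OTP is 482913"))  # ['OTP-like code']
```

When any label matches, the client can block the send and repeat the "we do not seek bank details, OTP or password" warning instead.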

4. Something for something (quid pro quo):

Don't fall prey to freebies offered on the other side of a survey link posted by a chatbot. The sheer excitement of a gift blinds us as we progress through the survey, and we end up sharing information, including passwords, to unlock the gift. The worst scenario is when the link is then forwarded within our closed circles. Educating users about such practices has its own set of challenges.

An employee who uses a chatbot can be made aware of such practices during InfoSec training, but the same cannot be done with customers. A company certainly has the option to broadcast messages educating clients on the safe use of chatbots, but with the identified risk of creating distrust in their minds, and the effort invested in a medium meant to amplify the company's brand would then fall through the cracks.

In such a scenario, an awareness campaign by cybersecurity groups is the best medium, and the rest can ride on it.



Before continuing to the next part, which is concerned with the praise of chatbots, here is a quote to stir the mind:

How can chatbots enhance security posture?

Chatbots are taking over major customer-engagement activities for businesses.

Since chatbots aren't such bad guys after all, you can turn the tide by using them to enhance security measures. There are multiple use cases where an AI-driven chatbot can help secure an enterprise. To list a few:

Identifying fraudsters: A chatbot hosted by a bank can use AI/ML to learn the response patterns of a user and identify an imposter. Such chatbots, with voice and facial recognition features, can readily raise an alarm about fraudsters, thus protecting the data of genuine customers from falling into the hands of crooks.
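As a toy stand-in for the AI/ML pattern analysis described above, the sketch below flags a chat session whose reply delays deviate sharply from the account holder's historical baseline. The function name, the z-score threshold, and the timing data are all assumptions for illustration; a production system would use far richer behavioral features:

```python
from statistics import mean, stdev

# Hypothetical check: does this session's reply cadence deviate far
# from the account holder's historical baseline?
def looks_like_imposter(baseline_delays, session_delays, z_threshold=3.0):
    mu, sigma = mean(baseline_delays), stdev(baseline_delays)
    return abs(mean(session_delays) - mu) > z_threshold * sigma

history = [2.1, 1.9, 2.4, 2.0, 2.2, 1.8]  # seconds per reply (assumed data)
print(looks_like_imposter(history, [2.0, 2.3, 1.9]))   # False
print(looks_like_imposter(history, [9.5, 8.7, 10.2]))  # True
```

The design point is that the chatbot never needs to reveal why it is suspicious; it can silently escalate to step-up authentication when the pattern drifts.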

Data is safer with chatbots: When information is shared with banking professionals, it cannot be contained within digital security boundaries. A securely built chatbot (of course), on the other hand, communicates directly with the database storing the customer data. There is thus better control over who has access to the data; it certainly does not leave with an employee.

There are security solutions that provide cloud bots for auto-remediation of security loopholes. These can also be interactive bots that follow an authentication workflow and require approval before carrying out a change.
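The approval-gated workflow can be sketched as follows. The class, method names, and the S3 finding are hypothetical; the point is only the control flow, where a detected loophole is queued and nothing changes until an approver signs off:

```python
# Hypothetical sketch of an approval-gated auto-remediation bot:
# each detected loophole becomes a pending change, applied only after sign-off.
class RemediationBot:
    def __init__(self):
        self.pending = {}   # finding -> proposed fix
        self.applied = []   # (finding, fix, approver) tuples

    def detect(self, finding: str, fix: str):
        self.pending[finding] = fix  # queue the fix, never auto-apply

    def approve(self, finding: str, approver: str):
        if finding in self.pending:
            fix = self.pending.pop(finding)
            self.applied.append((finding, fix, approver))  # carry out change

bot = RemediationBot()
bot.detect("public S3 bucket", "set bucket policy to private")
print(len(bot.applied))  # 0 -- nothing changes without approval
bot.approve("public S3 bucket", "secops-lead")
print(len(bot.applied))  # 1
```

Keeping the approval step in the loop is what separates a safe remediation bot from one that could itself become an attack vector.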

So, spread awareness about the secure development and usage of chatbots.

Subscribe to the newsletter for experts’ opinions and thought leadership blogs.