Laurent5
ServiceNow Employee

The name is bot…

As the hype surrounding chatbots doesn’t seem to fade (Gartner predicts twenty-five percent of customer service and support operations will be using some type of chatbot by 2020), more and more companies are deploying their new digital brand ambassadors, and I predict we’ll soon run out of possible chatbot names!

Although perceived as a must-have these days, it is fair to say that these bots, whose primary use case is to improve and accelerate customer service, are met with varying degrees of success. Who hasn’t been greeted on a website by a chatbot keen to help, yet quickly proving frustratingly unable to do so?

Propelled by buzzwords such as AI and Machine Learning, it is tempting to jump the gun and deploy a capability that could end up having an adverse effect on the customer experience.

 

Small talk

It may seem obvious, but it is nonetheless critical: what do you want your chatbot to actually do?

It is all very well to deploy impressive ML capabilities and perhaps amuse your customers for a few (short) moments, but it is important to keep the end goal in sight: in most use cases, your bot will provide a channel for customers to enquire, access information and report issues. Most importantly, your customers will do so in the hope that it gets them to their desired outcome faster than the other channels available to them.

 

Do we understand each other?

Here comes the technical bit. Although it is fashionable to class everything bot-related as Machine Learning or AI, the reality is that not all bots are. Broadly speaking, you need the following capabilities. First, the bot needs to understand the user’s intent, i.e. what the question is or what they are trying to achieve. This can leverage simple keyword matching or more advanced Natural Language Understanding (NLU) capabilities.

The intent is then processed to extract the entities, i.e. the specifics needed to deliver the best possible answer or course of action. An example would be “Is your London store open on Boxing Day?”: here the intent is to find out about opening times, and the specifics are the location and a particular date.
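
To make that split concrete, here is a minimal sketch of keyword-based intent detection and entity extraction for that example. The intent names, keyword lists and store locations are made up for illustration; in a real deployment you would more likely rely on your platform’s NLU service to score intents and entities.

```python
# Illustrative intent catalogue: each intent is matched on simple keywords.
# A real bot would typically use an NLU service to score intents instead.
INTENT_KEYWORDS = {
    "opening_times": ["open", "opening", "hours"],
    "report_issue": ["broken", "not working", "complaint"],
}

KNOWN_LOCATIONS = ["london", "manchester", "paris"]    # hypothetical store list
KNOWN_DATES = ["boxing day", "christmas", "new year"]  # hypothetical dates

def detect_intent(utterance: str) -> str:
    """Return the first intent whose keywords appear in the utterance."""
    text = utterance.lower()
    for intent, keywords in INTENT_KEYWORDS.items():
        if any(keyword in text for keyword in keywords):
            return intent
    return "unknown"

def extract_entities(utterance: str) -> dict:
    """Pull out the specifics (location, date) the answer depends on."""
    text = utterance.lower()
    entities = {}
    for location in KNOWN_LOCATIONS:
        if location in text:
            entities["location"] = location
    for date in KNOWN_DATES:
        if date in text:
            entities["date"] = date
    return entities

question = "Is your London store open on Boxing Day?"
print(detect_intent(question))     # -> opening_times
print(extract_entities(question))  # -> {'location': 'london', 'date': 'boxing day'}
```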

Addressing the request can leverage decision-tree technologies, where a number of questions are asked in order to navigate the user to the most relevant answer, or more advanced cognitive capabilities so that the bot can improve its understanding and ability to assist over time.
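
The decision-tree approach can be as simple as a table of questions whose answers route the user to the next node. The sketch below is hypothetical; the topics, wording and opening hours are invented purely for illustration.

```python
# Each node either asks a question (with options pointing to child nodes)
# or holds a final answer. Node names and wording are illustrative only.
DECISION_TREE = {
    "start": {
        "question": "What would you like to know about?",
        "options": {"opening times": "ask_location", "returns": "returns_policy"},
    },
    "ask_location": {
        "question": "Which store are you asking about?",
        "options": {"london": "london_hours", "manchester": "manchester_hours"},
    },
    "london_hours": {"answer": "The London store is open 9am-6pm, 11am-4pm on bank holidays."},
    "manchester_hours": {"answer": "The Manchester store is open 9am-5pm."},
    "returns_policy": {"answer": "You can return any item within 30 days with a receipt."},
}

def run_conversation(tree, choices):
    """Walk the tree using a scripted list of user choices and return the answer."""
    node = tree["start"]
    for choice in choices:
        node = tree[node["options"][choice]]
        if "answer" in node:
            return node["answer"]
    return node.get("question")  # still mid-conversation

print(run_conversation(DECISION_TREE, ["opening times", "london"]))
```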

Which approach you should adopt, or start with, will largely depend on the exact use cases you are trying to address. Some intents may be easily identified, while others may require more complex explanations and therefore deeper cognitive abilities.

 

Conversation starter

You could decide to start with a handful of conversations (i.e. a minimum viable product, sometimes referred to as the short tail) addressing the most commonly asked questions that a bot can handle well, in order to deflect calls. Assuming the intent is relatively straightforward, these could be structured as rule-based workflows.

You can then expand the bot’s capabilities by “teaching” it how to handle more complex queries. This typically involves more advanced cognitive capabilities, which can be time consuming. While rule-based conversations can be built by business analysts, cognitive capabilities tend to rely on ML frameworks requiring more programming and data science skills, which you might need to resource. You will also need more training/testing iterations than with regular decision trees.
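
As a rough idea of what that involves, here is a minimal sketch of training an intent classifier, using scikit-learn as a stand-in for whichever ML framework your platform provides. The labelled utterances are invented and far too few for a real bot, which is exactly why the training/testing iterations take time.

```python
# Minimal intent-classifier sketch: TF-IDF features plus logistic regression.
# The training data below is made up and far too small for production use.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

training_utterances = [
    "what time does the london store open",
    "are you open on boxing day",
    "my card was declined and I want to complain",
    "the app keeps crashing when I log in",
]
training_intents = [
    "opening_times",
    "opening_times",
    "report_issue",
    "report_issue",
]

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(training_utterances, training_intents)

# Classify a new, unseen utterance.
print(model.predict(["is your store open at the weekend"])[0])
```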

Even with the right resources in place, it may take several months before you can leave your bot in the wild (a company I spoke to recently told me they had been prototyping their bot, built on one of the main ML platforms, for about a year…).

 

Smooth operator

Once you have identified your core use cases and the platform you will build your bot on, you also need to consider a few other things.

The first is that you need to test, and test again, and plan for the unknown: no matter how many times you test your bot, chances are a user will ask a question you had not thought about. So your bot needs to be able to address these as gracefully as possible, usually by forwarding to a live agent. Equally, ensure that the flow enables navigation between topics for the smoothest experience.
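
One simple way to handle this, sketched below under the assumption that your platform exposes some kind of confidence score for the detected intent, is to hand over to a live agent whenever that score falls below a threshold. The scoring function and the 0.6 threshold are placeholders.

```python
# Sketch of a graceful fallback: if the bot is not confident about the intent,
# hand the conversation to a live agent rather than guessing.
CONFIDENCE_THRESHOLD = 0.6  # arbitrary illustrative value

def score_intent(utterance: str) -> tuple[str, float]:
    """Placeholder for whatever intent scoring your platform provides."""
    if "open" in utterance.lower():
        return "opening_times", 0.9
    return "unknown", 0.2

def handle(utterance: str) -> str:
    intent, confidence = score_intent(utterance)
    if confidence < CONFIDENCE_THRESHOLD:
        # Low-confidence or unknown request: forward to a live agent gracefully.
        return "I'm not sure I can help with that, let me connect you to an agent."
    return f"Handling intent: {intent}"

print(handle("Is the London store open today?"))
print(handle("My hamster ate my loyalty card"))
```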

Also plan for the unplanned. As much as you can refine the use cases in scope, unplanned events may throw a spanner in the works. For instance, third-party events could cause a surge of enquiries to your bot. If you are a financial institution, for example, events such as a hotel chain’s data breach, as seen recently in the news, may trigger a spike of conversations from anxious customers wanting to ensure their data is safe.

 

Finally, before rushing to add more conversations, ensure that you have the right metrics in place to measure whether your bot is indeed delivering the expected outcome. Usage and productivity metrics such as conversations by topic, completed or faulty conversations, transfers to an agent, etc. should be in place.
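
Even something as simple as counting conversations by topic and by outcome gives you a starting point. The sketch below assumes a conversation log with hypothetical topic and outcome fields; the records are invented for illustration.

```python
# Simple conversation metrics: counts by topic and by outcome
# (completed, transferred to agent, faulty). Field names are illustrative.
from collections import Counter

conversation_log = [
    {"topic": "opening_times", "outcome": "completed"},
    {"topic": "opening_times", "outcome": "completed"},
    {"topic": "report_issue", "outcome": "transferred_to_agent"},
    {"topic": "report_issue", "outcome": "faulty"},
]

by_topic = Counter(record["topic"] for record in conversation_log)
by_outcome = Counter(record["outcome"] for record in conversation_log)

print("Conversations by topic:", dict(by_topic))
print("Conversations by outcome:", dict(by_outcome))
completion_rate = by_outcome["completed"] / len(conversation_log)
print(f"Completed without an agent: {completion_rate:.0%}")
```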

Now, this article isn’t meant to be a comprehensive set of best practices, but simply to highlight some considerations before embarking on your journey. I will write more detailed articles on the topic in the future, but in the meantime, enjoy the conversation with your new friend the bot!
