It’s no secret that chatbot analytics are essential for improving a bot’s retention, engagement and other measures of success. But data can be opaque, and bottlenecks can go unnoticed if you’re not careful. When interpreting the numbers still has you scratching your head about why users stop talking to your bot, there must be a better way—and there is. Prompting users to leave chatbot feedback in their own words when things go awry can save you a great deal of time when iterating on your bot.

Bot makers must make it easy and seamless for users to provide detailed feedback on their experiences with a bot. Not only does this make it easier to assess a bot’s strengths and weaknesses, it empowers users, too. They’ll appreciate being heard, and prompting for feedback signals that you’re continually working to make the chatbot better. Let’s look at three big strategies bot makers use to prompt feedback from their users, and the pros and cons of each.


Introduce Response Ratings

To give users a simple, frictionless way to leave chatbot feedback, introduce response ratings on every reply your bot gives. This lets users flag a response with a single tap. When checking these ratings in your conversational analytics, you’ll immediately see which responses gave users a negative or positive experience. The downside of this method is that it doesn’t provide specific feedback, and users will probably only rate responses they find negative. That said, it remains a great way to quickly identify problematic responses.

Google Assistant makes use of response ratings in the form of a thumbs-up or thumbs-down button. If a user gives the assistant a thumbs-down, it will ask for more detailed feedback. This is optional—the user doesn’t have to respond beyond the initial rating—and provides a nice and unobtrusive way to generate chatbot feedback to help developers.
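As a minimal sketch of how such ratings might be collected and mined, the snippet below logs a thumbs-up/thumbs-down per response and surfaces the replies users flag most. The storage scheme, function names, and response IDs are all illustrative assumptions, not any particular analytics product’s API:

```python
# Record per-response thumbs-up/thumbs-down ratings so analytics can
# surface which replies users flag. Names and storage are illustrative.
from collections import defaultdict

ratings: dict[str, list[int]] = defaultdict(list)  # response_id -> scores

def rate_response(response_id: str, thumbs_up: bool) -> None:
    """Log +1 for a thumbs-up, -1 for a thumbs-down."""
    ratings[response_id].append(1 if thumbs_up else -1)

def flagged_responses(threshold: float = 0.0) -> list[str]:
    """Response IDs whose average rating falls below the threshold."""
    return [
        rid for rid, scores in ratings.items()
        if sum(scores) / len(scores) < threshold
    ]

# Example: two users dislike one reply, one user likes another.
rate_response("greeting_v2", thumbs_up=False)
rate_response("greeting_v2", thumbs_up=False)
rate_response("balance_check", thumbs_up=True)
```

A real bot would persist these scores in its analytics store rather than an in-memory dict, but the shape of the data is the same: a rating tied to a specific response.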

Measure Sentiment in Chatbot Analytics

If you ask someone to perform a task and they do it incorrectly, you’ll most likely tell them what they did wrong and how it differed from what you asked. This isn’t any different with a chatbot; users aren’t shy about telling a bot when it’s messed up, so you need to leverage that data in your conversational analytics.

With sentiment analysis, your chatbot analytics can surface chatbot feedback automatically, without prompting the user at all. By analyzing the polarity of terms (whether a user’s vocabulary indicates a positive or negative experience) and their tone, a feedback bot can tell whether it’s creating (or assuaging) a tense, frustrating experience. For this method, you’ll need to label responses or keywords as positive or negative feedback. Examples of negative feedback could be:

  • That’s not what I asked for.
  • That’s incorrect.

And phrases or words indicating positive feedback would include things like:

  • Thank you!
  • You’re a lifesaver.

To make the best use of this chatbot feedback method, the bot should respond appropriately to the user’s tone. For example, it should thank the user for positive feedback or note that it’s always happy to help. Negative feedback should prompt the bot to apologize and ask how its response could be improved.

After the feedback bot has funneled the user feedback to you, you can quickly gauge the user experience in their own words. Sentiment analysis can also present a general idea of what kind of responses from your bot trigger which moods, which is an invaluable addition to general conversational analytics.

Just Ask for Chatbot Feedback

If sentiment analysis requires more sophistication than you’re prepared to build into a chatbot, consider simply asking for feedback. This method is best reserved for platforms that support menu systems and buttons, such as Facebook Messenger or Kik. After every response, you can offer a persistent feedback button that users can tap to share their thoughts.
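On Facebook Messenger, such a button can be attached as a quick reply on the Send API message body. The sketch below assembles that body; the recipient ID and payload string are placeholders, and actually sending it would still require a page access token and an HTTP POST to the Graph API:

```python
# Build a Messenger Send API message body that attaches a "Give feedback"
# quick-reply button to a bot response. Recipient ID and payload string
# are placeholders your webhook would define.

def with_feedback_button(recipient_id: str, text: str) -> dict:
    return {
        "recipient": {"id": recipient_id},
        "message": {
            "text": text,
            "quick_replies": [
                {
                    "content_type": "text",
                    "title": "Give feedback",
                    "payload": "FEEDBACK_REQUESTED",  # handled by your webhook
                }
            ],
        },
    }

msg = with_feedback_button("USER_ID", "Your transfer is complete.")
```

When the user taps the button, Messenger delivers the payload back to your webhook, which can then open a free-text feedback prompt.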

Another way to prompt feedback is to ask when completing or confirming a task. For example:

Bot: You would like to transfer $400 from your checking account to your savings account. Is that correct?

User: No.

Bot: I’m sorry I couldn’t help you. Could you tell me what went wrong?

From there, the user’s explanation serves as chatbot feedback for improving the bot.
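The exchange above could be wired up with a small confirmation handler along these lines. The function name and canned replies are hypothetical, and a real bot would also record the user’s follow-up explanation in its analytics:

```python
# Sketch of a confirmation step that turns a "No" into a feedback prompt.
# Accepted yes/no variants and the canned replies are illustrative.

def handle_confirmation(user_reply: str) -> str:
    """Return the bot's next utterance based on the user's yes/no answer."""
    reply = user_reply.strip().lower()
    if reply in {"yes", "y", "correct"}:
        return "Great, your transfer is on its way!"
    if reply in {"no", "n", "incorrect"}:
        return "I'm sorry I couldn't help you. Could you tell me what went wrong?"
    return "Sorry, I didn't catch that. Is the transfer correct? (yes/no)"
```

Whatever the user types after the apology is exactly the in-their-own-words feedback this strategy is designed to capture.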

Responding to Feedback

Once the user has provided feedback on your bot, let them know they’ve been heard! Anticipate that things will go wrong with your bot, and offer proactive solutions to set things right. If the user runs into a problem, for example, provide suggestions or menu options that get them back on track after they’ve given feedback. Alternatively, the bot might automatically connect them to a human assistant instead.

Positive feedback should also be rewarded, perhaps with a deal or offer. Once a bot has proven its ability to help the user, it can use that earned trust to cross-sell or upsell. For example, say a user thanks a beauty bot for helping them find the perfect lipstick to complement their skin. From there, the bot might suggest some complementary eyeshadow colors to match.

Botanalytics provides conversational analytics to help your bot hit the charts. Sign up for free now!