How to test chatbots in Rasa framework?


Validating your data is the first step you should take before hitting the train button in Rasa. The following command checks your domain, NLU data, and stories for mistakes and inconsistencies:

rasa data validate

Test stories are part of conversation-driven development, which I have written about before in the article entitled I’m learning all the time — conversation-driven development in a chatbot for Erasmus students with Rasa framework. Test stories are written in the form of exemplary conversations and check whether the bot behaves as expected.

rasa test
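For illustration, a minimal test story might look like the sketch below. The intents (`greet`, `ask_help`) and responses (`utter_greet`, `utter_offer_help`) are hypothetical examples; they would have to exist in your own domain and NLU data.

```yaml
# tests/test_stories.yml (hypothetical names throughout)
stories:
- story: greet and ask for help
  steps:
  # the literal user message, plus the intent it should be classified as
  - user: |
      hi there
    intent: greet
  - action: utter_greet
  - user: |
      can you help me?
    intent: ask_help
  - action: utter_offer_help
```

Unlike regular training stories, test stories include the full user message, so `rasa test` can check both intent classification and the predicted actions in one pass.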


You should make a habit of adding new test stories as your chatbot grows and learns. It is not something you can do once and then forget about.

Your NLU file contains all the training examples your chatbot is trained with. In real life, the bot will of course encounter messages that are not in the training set. That is why you should split off part of your data for testing, to simulate exactly that situation.
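As a sketch, an NLU file with two hypothetical intents could look like this (the intent names and examples are illustrative, not from any particular bot):

```yaml
# data/nlu.yml (hypothetical intents and examples)
nlu:
- intent: greet
  examples: |
    - hi
    - hello
    - hey there
- intent: ask_help
  examples: |
    - can you help me?
    - I need some help
    - what can you do for me?
```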

rasa data split nlu
rasa test nlu --nlu train_test_split/test_data.yml
rasa test nlu --nlu data/nlu.yml --cross-validation

You can evaluate your dialogue model on a set of test stories:

rasa test core --stories test_stories.yml --out results

You can also compare several policy configurations: the command below trains each configuration several times, excluding a different percentage of the training stories on each run, so you can see how the configurations hold up with less data.

rasa train core -c config_1.yml config_2.yml --out comparison_models --runs 3 --percentages 0 5 25 50 70 95
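The files passed with -c are ordinary Rasa configuration files. As a rough sketch, config_1.yml might contain something like the following; the specific pipeline components and policies are illustrative choices, not a recommendation:

```yaml
# config_1.yml (illustrative sketch, not a tuned configuration)
language: en
pipeline:
  - name: WhitespaceTokenizer
  - name: CountVectorsFeaturizer
  - name: DIETClassifier
    epochs: 100
policies:
  - name: MemoizationPolicy
  - name: TEDPolicy
```

A second file, config_2.yml, would vary some of these choices, and the comparison run shows which configuration predicts the test stories better.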

This content was originally published HERE

