Currently, if I want to train the Bot on 100 webpages of a website, I either have to upload each page link separately or have the whole website crawled and then deselect the pages I don't want. Both options are very time-consuming. For big websites the Bot ends up crawling thousands of webpages, so deselecting down to 100 is not an option. It would be ideal if the list of webpages could be imported and uploaded via a single Excel file.
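For illustration, here is a rough sketch of what such a bulk import could look like from the user's side: reading the URLs out of one Excel file and submitting them in a batch. The file name, column name, and API endpoint below are hypothetical placeholders, not the product's actual interface.

```python
import pandas as pd
import requests

# Read the list of webpage URLs from the first sheet of an Excel file.
# Assumes the sheet has a column named "url" (placeholder name).
df = pd.read_excel("webpages.xlsx")

for url in df["url"].dropna():
    # Hypothetical endpoint; the real bot platform would expose its own import API.
    response = requests.post(
        "https://example.com/api/bot/training-pages",
        json={"url": url},
    )
    response.raise_for_status()
```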